Culture

Bigger brains are smarter, but not by much

The English idiom "highbrow," derived from a physical description of a skull barely able to contain the brain inside of it, comes from a long-held belief in the existence of a link between brain size and intelligence.

For more than 200 years, scientists have looked for such an association. The investigation began with rough measures, such as estimated skull volume or head circumference, and became more sophisticated in recent decades as MRIs offered a highly accurate accounting of brain volume.

Yet the connection has remained hazy and fraught, with many studies failing to account for confounding variables, such as height and socioeconomic status. The published studies are also subject to "publication bias," the tendency to publish only more noteworthy findings.

A new study, the largest of its kind, led by Gideon Nave of the University of Pennsylvania's Wharton School and Philipp Koellinger of Vrije Universiteit Amsterdam, has clarified the connection. Using MRI-derived information about brain size in connection with cognitive performance test results and educational-attainment measures obtained from more than 13,600 people, the researchers found that, as previous studies have suggested, a positive relationship does exist between brain volume and performance on cognitive tests. But that finding comes with important caveats.

"The effect is there," says Nave, an assistant professor of marketing at Wharton. "On average, a person with a larger brain will tend to perform better on tests of cognition than one with a smaller brain. But size is only a small part of the picture, explaining about 2 percent of the variability in test performance. For educational attainment the effect was even smaller: an additional 'cup' (100 cubic centimeters) of brain would increase an average person's years of schooling by less than five months." Koellinger says, "This implies that factors other than this single factor that has received so much attention across the years account for the other 98 percent of the variation in cognitive test performance."
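To put the 2 percent figure in perspective: the share of variance a single predictor explains is the square of its correlation, so 2 percent of variability corresponds to a correlation of roughly r ≈ 0.14. A minimal simulation (illustrative numbers only, not the study's data) makes the relationship concrete:

```python
import math
import random

random.seed(0)
n = 13_600  # roughly the study's sample size
r = 0.14    # correlation implied by ~2 percent variance explained

# Simulate standardized brain volumes and cognitive scores correlated at r
brain = [random.gauss(0, 1) for _ in range(n)]
score = [r * b + math.sqrt(1 - r * r) * random.gauss(0, 1) for b in brain]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r_hat = pearson(brain, score)
print(round(r_hat ** 2, 3))  # share of variance explained, close to 0.02
```

With a sample this large the estimate is tight; with the few dozen participants typical of earlier studies, the same underlying correlation can easily look much larger or smaller, which is one reason the earlier literature was so inconsistent.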

"Yet, the effect is strong enough that all future studies that will try to unravel the relationships between more fine-grained measures of brain anatomy and cognitive health should control for total brain volume. Thus, we see our study as a small, but important, contribution to better understanding differences in cognitive health."

Nave and Koellinger's collaborators on the work, which is published in the journal Psychological Science, included Joseph Kable, Baird Term Professor in Penn's Department of Psychology; Wi Hoon Jung, a former postdoctoral researcher in Kable's lab; and Richard Karlsson Linnér, a postdoc in Koellinger's lab.

From the outset, the researchers sought to minimize the effects of bias and confounding factors in their research. They pre-registered the study, meaning they published their methods and committed to publishing ahead of time so they couldn't simply bury the results if the findings appeared to be insignificant. Their analyses also systematically controlled for sex, age, height, socioeconomic status, and population structure, measured using the participants' genetics. Height is correlated with better cognitive performance, for example, but also with bigger brain size, so the study attempted to zero in on the contribution of brain size by itself.

Earlier studies had consistently identified a correlation between brain size and cognitive performance, but the relationship seemed to grow weaker as studies included more participants, so Nave, Koellinger, and colleagues hoped to pursue the question with a sample size that dwarfed previous efforts.

The study relied on a recently amassed dataset, the UK Biobank, a repository of information from more than half a million people across the United Kingdom. The Biobank includes participants' health and genetic information as well as brain scan images of a subset of roughly 20,000 people, a number that is growing by the month.

"This gives us something that never existed before," Koellinger says. "This sample size is gigantic -- 70 percent larger than all prior studies on this subject put together -- and allows us to test the correlation between brain size and cognitive performance with greater reliability."

Measuring cognitive performance is a difficult task, and the researchers note that even the evaluation used in this study has weaknesses. Participants took a short questionnaire that tests logic and reasoning ability but not acquired knowledge, yielding a relatively "noisy" measure of general cognitive performance.

Using a model that incorporated a variety of variables, the team looked to see which were predictive of better cognitive performance and educational attainment. Even controlling for other factors, like height, socioeconomic status, and genetic ancestry, total brain volume was positively correlated with both.

The findings are somewhat intuitive. "It's a simplified analogy, but think of a computer," Nave says. "If you have more transistors, you can compute faster and transmit more information. It may be the same in the brain. If you have more neurons, this may allow you to have a better memory, or complete more tasks in parallel.

"However, things could be much more complex in reality. For example, consider the possibility that a bigger brain, which is highly heritable, is associated with being a better parent. In this case, the association between a bigger brain and test performance may simply reflect the influence of parenting on cognition. We won't be able to get to the bottom of this without more research."

One of the notable findings of the analysis related to differences between males and females. "Just like with height, there is a pretty substantial difference between males and females in brain volume, but this doesn't translate into a difference in cognitive performance," Nave says.

A more nuanced look at the brain scans may explain this result. Other studies have reported that the cerebral cortex, the outer layer of the brain, tends to be thicker in females than in males.

"This might account for the fact that, despite having relatively smaller brains on average, there is no effective difference in cognitive performance between males and females," Nave says. "And of course, many other things could be going on."

The authors underscore that the overarching correlation between brain volume and "braininess" was a weak one; no one should be measuring job candidates' head sizes during the hiring process, Nave jokes. Indeed, what stands out from the analysis is how little brain volume seems to explain. Factors such as parenting style, education, nutrition, stress, and others are likely major contributors that were not specifically tested in the study.

"Previous estimates of the relationship between brain size and cognitive abilities were uncertain enough that the true relationship could have been practically very important, or, alternatively, not much different from zero," says Kable. "Our study allows the field to be much more confident about the size of this effect and its relative importance moving forward."

In follow-up work, the researchers plan to zoom in to determine whether certain regions of the brain, or connectivity between them, play an outsize role in contributing to cognition.

They're also hopeful that a deeper understanding of the biological underpinnings of cognitive performance can help shine a light on environmental factors that contribute, some of which can be influenced by individual actions or government policies.

"Suppose you have the necessary biology to become a fantastic golf or tennis player, but you never have the opportunity to play, so you never realize your potential," Nave says.

Adds Koellinger: "We're hopeful that, if we can understand the biological factors that are linked to cognitive performance, it will allow us to identify the environmental circumstances under which people can best manifest their potential and remain cognitively healthy. We've just started to scratch the surface of the iceberg here."

Credit: 
University of Pennsylvania

African-American mothers rate boys higher for ADHD

image: George DuPaul is a professor of school psychology at Lehigh University.

Image: 
Courtesy of Lehigh University

African-American children often are reported by parents and teachers to display behaviors of ADHD at a higher rate than children from other racial and ethnic groups. For the first time, researchers have found that African-American mothers in a study rated boys as displaying more frequent ADHD symptoms than Caucasian mothers did, regardless of child race. The findings mean that racial differences found in prior studies may be more due to maternal race than child race, said researcher George DuPaul of Lehigh University.

The findings are reported in this month's Journal of Attention Disorders, in a special print issue on Parenting Youth with ADHD.

"The primary takeaway is that common psychological assessment measures like parent behavior questionnaires are influenced by race; these assessments are not happening in a cultural vacuum," said DuPaul, a professor of school psychology who co-authored the paper, "Impact of Maternal and Child Race on Maternal Ratings of ADHD Symptoms in Black and White Boys," led by Ph.D. candidate Charles Barrett, now a school psychologist. "Hopefully these findings will alert professionals conducting evaluations that race (particularly parental race) makes a difference in how children's behavior is viewed."

Although the study focused on assessing ADHD symptoms in African-American and Caucasian boys, the findings have significant implications for the manner in which school psychologists and other professionals understand children's behavior for a host of suspected disabilities, said Barrett, who is lead school psychologist with Loudoun County Public Schools in northern Virginia. "Most notably, our research underscores the importance of evaluators critically examining how factors that are beyond child behavior can influence diagnostic outcomes," Barrett said.

Results shed new light on ADHD diagnosis

For the study, African-American and Caucasian mothers watched a brief video of either an African-American or Caucasian boy displaying behaviors associated with Attention Deficit Hyperactivity Disorder (ADHD), such as inattention, hyperactivity and impulsivity. Mothers then completed a brief rating of the frequency of ADHD symptoms.

Mothers were randomly assigned to view either a Caucasian or African-American boy, making up four groups: African-American mothers viewing an African-American boy; African-American mothers viewing a Caucasian boy; Caucasian mothers viewing an African-American boy; and Caucasian mothers viewing a Caucasian boy. Children followed a script so that their behaviors were virtually identical.

Researchers found significant differences in ADHD symptom ratings as a function of maternal race, but not based on child race or the interaction between maternal and child race.

Based on past studies showing that African-American children, particularly boys, consistently receive higher scores on common ADHD assessment measures than Caucasian children do, the researchers expected child race to be the driving factor in accounting for differences in ratings of ADHD symptoms. Instead, "only small, nonstatistically significant effects for child race were found and the interaction between child and maternal race was also associated with small, nonstatistically significant effects," they said. "Differences in ADHD symptom ratings were influenced almost entirely by maternal race."
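The logic of the four-group design can be made concrete with a small worked example. Using hypothetical mean symptom ratings (the numbers below are invented for illustration, not the study's results), the main effect of maternal race, the main effect of child race, and their interaction can each be computed directly from the 2 x 2 table of group means:

```python
# Hypothetical mean ADHD symptom ratings (invented for illustration only)
aa_mom_aa_boy = 2.4  # African-American mother rating an African-American boy
aa_mom_c_boy = 2.3   # African-American mother rating a Caucasian boy
c_mom_aa_boy = 1.8   # Caucasian mother rating an African-American boy
c_mom_c_boy = 1.7    # Caucasian mother rating a Caucasian boy

# Main effect of maternal race: average across child race, then difference
maternal_effect = (aa_mom_aa_boy + aa_mom_c_boy) / 2 - (c_mom_aa_boy + c_mom_c_boy) / 2

# Main effect of child race: average across maternal race, then difference
child_effect = (aa_mom_aa_boy + c_mom_aa_boy) / 2 - (aa_mom_c_boy + c_mom_c_boy) / 2

# Interaction: does the maternal-race gap depend on which boy is being rated?
interaction = (aa_mom_aa_boy - c_mom_aa_boy) - (aa_mom_c_boy - c_mom_c_boy)

print(round(maternal_effect, 2), round(child_effect, 2), round(interaction, 2))
# In this illustration: a large maternal-race effect (0.6), a small
# child-race effect (0.1), and no interaction (0.0) -- the same qualitative
# pattern the study reported.
```

In the actual study, of course, these comparisons were made with statistical tests rather than raw differences of means; the sketch only shows which contrasts the 2 x 2 design lets researchers separate.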

Accurate ADHD diagnosis is vital to treatment

Affecting 3 percent to 10 percent of the child population, ADHD is a chronic, neurodevelopmental disorder associated with significant academic and/or social impairment, according to the American Psychiatric Association. Thus, it is critical to identify children with ADHD in a reliable, valid fashion so that treatment resources, such as behavioral therapy, medication and behavioral parent training, are allocated to those in greatest need.

Current best practice in ADHD diagnosis involves multiple assessment methods and respondents to document symptoms and impairment across home and school settings, including observations by parents and teachers.

While several studies have found that adults rate the severity of ADHD symptoms higher in African-American children than in Caucasian children, studies have also consistently found significantly lower diagnostic and treatment rates for ADHD in the African-American population. "Given the apparent disparities in diagnosis and treatment of ADHD, it is critical to more systematically examine racial differences in assessment outcomes," the researchers said.

While other studies have used similar designs to look at differences in assessment of ADHD symptoms across countries, this is the first to look at racial differences in the United States, DuPaul said.

Participants in the study were 131 mothers recruited in suburban and urban locations in the Northeast, Mid-Atlantic, Southeastern and Midwest regions of the U.S. About half of participants were Caucasian and half African-American, with both racial groups primarily from middle class socio-economic status.

Mental health professionals and others who work with children with ADHD are most likely to benefit from these findings, researchers said, but it is also important that parents, teachers and the general public are aware of the influence of race on the ADHD evaluation process.

As school-based practitioners, Barrett and his colleagues often focus on children's behavior, even though child behavior is not the only variable to consider, Barrett said. "As the ultimate goal of assessment is not necessarily to uncover what is different about the child, willingness to explore and examine other variables leads to an equally important question of assessment: why child behavior is perceived differently by different people," he said.

The study results are consistent with prior findings that assessment measures used to identify ADHD may produce quantitatively different results for African-American and Caucasian children who have similar treatment needs, meaning that clinicians should not rely on a single method or respondent when screening or assessing for ADHD.

"Given consistent evidence of health disparities across racial groups with respect to ADHD, it is important to understand and address variables related to racial differences and to develop assessment measures that will provide equivalent data and lead to accurate diagnostic decisions across racial and ethnic subgroups," the researchers said.

Credit: 
Lehigh University

A cancer drug may help treat human papillomavirus infections

image: This is Sanjib Banerjee.

Image: 
UAB

BIRMINGHAM, Ala. - Preclinical experiments by University of Alabama at Birmingham researchers suggest the cancer drugs vorinostat, belinostat and panobinostat might be repurposed to treat infections caused by human papillomaviruses, or HPVs.

HPV infections caused an estimated 266,000 deaths from cervical cancer worldwide in 2012, according to the World Health Organization. Routine screening by Pap smears or HPV DNA tests has reduced death rates in developed countries compared to less developed regions of the globe. Still, an estimated 12,200 United States women are diagnosed with cervical cancer each year.

Highly efficacious vaccines against HPV infection exist -- including the recently approved Gardasil 9, which immunizes against nine genotypes of HPV known to cause cervical, vulvar, vaginal and anal cancers, and genital warts. But the vaccine needs to be given before a person becomes sexually active, since it has no therapeutic efficacy against existing HPV infections.

"Safe, effective and inexpensive therapeutic agents are urgently needed," said N. Sanjib Banerjee, Ph.D., assistant professor of Biochemistry and Molecular Genetics at UAB and lead author of the vorinostat study.

Epithelium of anogenital sites -- the cervix, penis and anus -- or epithelium of the mouth and throat are sites of HPV infection. But HPVs cannot be propagated in conventional cell culture, hampering the investigation into their pathogenic effects. The laboratory of Louise Chow, Ph.D., and Thomas Broker, Ph.D., in the UAB Department of Biochemistry and Molecular Genetics has investigated HPV-host interactions for decades. They discovered that the productive program of HPV depends on differentiation of the epithelium into a full-thickness, squamous epithelium. Furthermore, HPV reactivates host DNA replication in these differentiated cells, such that the replication proteins and substrates become available to support viral DNA amplification.

The Chow and Broker lab reproduced a fully differentiated human squamous epithelium by culturing primary human keratinocytes at an air-media interface for two to three weeks, a growth they call raft culture. In 2009, their lab developed a breakthrough model for a raft culture of HPV-18-infected primary human keratinocytes, allowing a robust amplification of HPV-18 DNA and production of infective viral progeny. This productive raft culture is an ideal model for preclinical investigation of potential anti-HPV agents.

Banerjee and colleagues hypothesized that inhibitors of histone deacetylases, or HDACs, would inhibit HPV DNA amplification because of their known mechanism of disrupting chromosomal DNA replication. Chromosomal replication requires HDAC alterations of histone proteins, the proteins that act like spools that wind DNA to help package and condense chromosomes and the viral genome. Vorinostat inhibits many HDACs, so it might interrupt not only chromosomal replication but also viral DNA replication.

Using the HPV-18 model raft cultures, the researchers found that vorinostat effectively inhibited HPV-18 DNA amplification and virus production. Importantly, vorinostat also induced the programmed cell death called apoptosis in a fraction of the differentiated cells. Cell death could be attributable to DNA breakage when chromosomal DNA replication was interrupted. Similar results were obtained with two additional HDAC inhibitors, belinostat and panobinostat. In contrast, the differentiated cells of uninfected raft cultures, which do not replicate their DNA, were largely spared in the presence of the inhibitors.

The UAB team also examined how vorinostat affected levels and functions of viral oncoproteins, and they described the mechanisms that led to programmed cell death in HPV-18-infected cultures. "On the basis of these detailed studies," Banerjee said, "we suggest that HDAC inhibitors are promising compounds for treating benign HPV infections, abrogating progeny production and hence interrupting infectious transmission."

The UAB team also reported that vorinostat caused extensive cell death in raft cultures of dysplastic and cancer cell lines harboring HPV-16. HPV-16 and HPV-18 are the most prevalent, high-risk HPVs responsible for causing anogenital and oropharyngeal cancers. "But further investigation would be required to verify that these agents could also be useful in treating HPV associated dysplasias and cancers," Banerjee said.

Credit: 
University of Alabama at Birmingham

Thriving reef fisheries continue to provide food despite coral bleaching

image: This is a catch of herbivore species from a trap fisher in Seychelles.

Image: 
James Robinson

Reef fisheries can continue to provide food and income despite corals being lost to climate change, according to new research conducted in the Seychelles.

The unexpected results of a 20-year study into reef fisheries published in the journal Nature Ecology and Evolution this week showed fisheries being maintained despite extreme coral bleaching. Remarkably, rapid proliferation of fishes with low dependence on corals led to catches remaining stable or even increasing.

But the results also showed fishing success was 'patchy' and more dependent on fewer species.

Around six million people fish on coral reefs. Each year their catch - estimated to be between 1.4 and 4.2 million tonnes - provides a critical source of food and income for many millions more. But climate change-driven coral bleaching events, caused by warming seas, are damaging coral habitat and depleting fish biodiversity, which has sparked fears that these vibrant ecosystems will no longer support productive fisheries.

A Lancaster University-led study set out to test this, using 20 years of fish abundance, catch and habitat data to assess the long-term impacts of climate-driven coral mass mortality and changes in artisanal coral reef fisheries in the Seychelles.

As part of their study they looked at more than 45,000 daily fishery landing records from 41 different sites. They also conducted 960 underwater surveys at 12 locations.

After the mass coral bleaching event in 1998, which caused substantial loss of coral habitat across Seychelles, reef fish catches have either remained the same or even increased. Although many reefs became overgrown with seaweeds, increases in algal-feeding fish communities such as rabbitfish are enabling local fishers to continue harvesting food.

Dr James Robinson of Lancaster University's Environment Centre said:

"Bleaching in 1998 caused mass coral mortality, habitat collapse, and shifts to seaweed dominance on some reefs, and so we expected the fishery to be in decline. But we overlooked the potential for algal-feeding fish to benefit from higher algal productivity."

"With coral bleaching events becoming more frequent and more intense as the climate warms, the unexpected news was that these fisheries continued to provide benefits for people."

Calvin Gerry of the Seychelles Fishing Authority, a co-author of the study, said: "We focussed on the inshore trap fishery in this study, as it is an important sector in Seychelles, and a common gear on coral reefs globally."

"Most of the fish from the trap fishery are sold and consumed locally, rather than exported internationally. Therefore, changes to this fishery have potential to influence both fishers and consumers domestically."

The study focused on short- and medium-term impacts of climate change. But the researchers have warned that these fisheries may be more unpredictable and variable than before because the fishes contributing to catches were much more patchily distributed.

Declines in healthy coral habitat reduced the diversity of species in catches, and fishers were more reliant on a few highly productive rabbitfish species.

Professor Nicholas Graham of Lancaster University, a co-author of the study, added: "Although we saw that after coral bleaching the average fish catch rose or remained stable, fishing success was patchy. After bleaching, catches became either much larger or much smaller than the average."

"These data from the Seychelles forewarn of changes likely for coral reef fisheries in other countries. While the news for fishers is better than we might expect, the algal-covered reefs are in marked contrast to the complex coral habitats which once hosted myriad and diverse coral reef fishes."

While the rabbitfish boost was shown to give fishers a few years of respite from the effects of widespread coral bleaching, the authors caution that the longer-term outlook for reef fisheries remains uncertain.

Credit: 
Lancaster University

An opioid epidemic may be looming in Mexico -- and the US may be partly responsible

FINDINGS

Opioid use in Mexico has been low, but national and international factors are converging to create a threat of increased drug use and addiction rates. Many of these factors may have originated in the U.S., making this a potential joint U.S.-Mexico epidemic.

BACKGROUND

Previously, various cultural and legislative factors had combined to keep opioid use low in Mexico. These factors included physicians' difficulties in obtaining special prescription pads for controlled substances that included opioids; limits on the number of prescriptions per prescriber; strict guidelines on opioid storage; and the high cost of opioids in Mexico compared with their cost in North America. There was also a perception among Mexicans that opioids are only for the terminally ill, are illegal and are too expensive.

Yet, Mexico's population of residents ages 65 and older is expected to more than double by 2030, which means more people will be diagnosed with chronic diseases or cancer, calling for the use of opioids to relieve pain. In addition, a number of factors could lead to an overall increase in opioids -- and a potential epidemic -- in Mexico. These include: recent legislative changes there that make it easier to prescribe opioids; national health insurance coverage of opioids; pressure from pharmaceutical companies for Mexico to boost prescriptions to make up for stricter regulations and a dwindling market in the U.S.; increased heroin production and trafficking in Mexico; and the deportation of injection drug users to the country.

METHOD

The study was an analytic essay that examined recent changes in legislation as well as population and drug trends in Mexico. The authors studied published academic literature, Mexican federal documents and guidelines, and news reports pertaining to opioid use in Mexico.

IMPACT

Because of the United States' role in contributing to Mexico's opioid problem, this is a potential joint epidemic, said Dr. David Goodman-Meza, clinical instructor in the UCLA Division of Infectious Diseases and the paper's lead author. The U.S. should provide resources for the mitigation of a possible opioid epidemic in the same way the country provides resources for the 'war on drugs' in Mexico -- for instance, by helping to build addiction treatment centers or a system to monitor opioid use.

AUTHORS

The study was a collaboration among experts in substance use, drug policy and infectious diseases from Mexico, the United States and Canada. Other authors are Dr. Raphael Landovitz and Steve Shoptaw of UCLA; Dr. Maria Elena Medina Mora of the Mexican National Psychiatry Institute; Dr. Carlos Magis-Rodriguez of the Mexican National Center of HIV/AIDS; and Dr. Dan Werb of UC San Diego.

JOURNAL

The article will be published in the American Journal of Public Health.

Credit: 
University of California - Los Angeles Health Sciences

Oldest-known ancestor of modern primates may have come from North America, not Asia

image: Teilhardina brandti, a 56-million year-old primate found in Wyoming, may be older than its Asian cousin, previously thought to be the earliest ancestor of modern primates. Unusual tooth sockets in this lower jaw of Teilhardina brandti helped make the determination.

Image: 
Florida Museum image by Paul Morse

GAINESVILLE, Fla. -- About 56 million years ago, on an Earth so warm that palm trees graced the Arctic Circle, a mouse-sized primate known as Teilhardina first curled its fingers around a branch.

Teilhardina is the earliest-known ancestor of modern primates, and its close relatives would eventually give rise to today's monkeys, apes and humans. But one of the persistent mysteries about this distant cousin of ours is where it originated.

Teilhardina (ty-hahr-DEE'-nuh) species quickly spread across the forests of Asia, Europe and North America, a range unparalleled by all other primates except humans. But where did its journey begin?

New research shows that Teilhardina brandti, a species found in Wyoming, is as old or older than its Asian and European relatives, upending the prevailing hypothesis that Teilhardina first appeared in China.

Teilhardina's origins, however, remain a riddle.

"The scientific conclusion is 'We just don't know,'" said Paul Morse, the study's lead author and a recent University of Florida doctoral graduate. "While the fossils we've found potentially overturn past hypotheses of where Teilhardina came from and where it migrated, they definitely don't offer a clearer scenario."

What is clear, Morse said, is that T. brandti had a wide variety of features, some of which are as primitive as those found in Teilhardina asiatica, its Asian cousin, previously thought to be the oldest species in the genus.

To make this determination, Morse studied 163 teeth and jaws in the most comprehensive analysis of T. brandti to date.

Teeth contain a treasure-trove of information and often preserve better than bone, thanks to their tough enamel. They can reveal clues about an animal's evolutionary past, its size, diet and age as an individual and in geological time.

Primate teeth have particularly distinct structures that are immediately recognizable to the trained eye, said Jonathan Bloch, study co-author and curator of vertebrate paleontology at the Florida Museum of Natural History.

"Identifying differences between primate teeth is not so different from a biker recognizing that a Harley is different from a scooter or an art critic evaluating whether an image was created by Picasso or Banksy," he said. "In detail, they are very different from each other in specific, predictable ways."

While Teilhardina bones are very rare in the fossil record, its teeth are more plentiful - if you know how to find them. Bloch's team of paleontologists, Morse included, has spent years combing the surface of Wyoming's Bighorn Basin on hands and knees and then packing out 50-pound bags of soil to a river to screen wash. The remaining bits of bones and teeth - which can be smaller than a flea - are examined under a microscope back at the museum.

This painstaking search has built up the dental record of T. brandti from a single molar - used to first describe the species in 1993 - to hundreds of teeth, providing a broad look at the primate's population-level variation.

Still, Morse and Bloch were unprepared for the peculiar variation exhibited by specimen UF 333700, a jagged piece of jaw with T. brandti teeth.

"Jon and I started arguing about the alveoli" - empty tooth sockets - "and how they didn't look right at all," said Morse, now a postdoctoral researcher at Duke University. "By the end of the day, we realized that specimen completely overturned both the species definition of T. asiatica and part of the rationale for why it is the oldest Teilhardina species."

Studies based on a small number of teeth simply missed the diversity in Teilhardina's physical characteristics, Morse said.

"There's likely a tremendous amount of variation in the fossil record, but it's extremely difficult to capture and measure when you have a small sample size," he said. "That's one of the reasons collecting additional fossils is so important."

The analysis also reshuffled the Teilhardina family tree, reducing the number of described species from nine to six and reclassifying two species as members of a new genus, Bownomomys, named for prominent vertebrate paleontologist Thomas Bown.

But the precise ages of Teilhardina species are still impossible to pinpoint and may remain that way.

Teilhardina appeared during the geological equivalent of a flash in the pan, a brief 200,000-year period known as the Paleocene-Eocene Thermal Maximum, or PETM. This era was characterized by a massive injection of carbon into the Earth's atmosphere, which sent global temperatures soaring. Sea levels surged by 220 feet, ecosystems were overhauled and the waters at the North Pole warmed to 74 degrees Fahrenheit.

Scientists can use the distinct carbon signature of the PETM to locate this period in the rock record, and carbon isotopes in teeth can also be used to identify fossil animals from the era.

But among Teilhardina fossil sites across the globe, only Wyoming has the uninterrupted, neatly demarcated layers of rock that allow paleontologists to home in on more precise dates.

"The humblest statement would be to say that these species are essentially equivalent in age," Bloch said. "Determining which came earlier in the PETM probably surpasses the level of resolution we have in the rock record. But what we can say is that the only place where you can really establish where Teilhardina appears in this climate event with confidence is in the Bighorn Basin."

As the Earth warmed, plants and animals expanded their ranges northward, returning south as temperatures cooled at the end of the PETM.

"This dance of plants and animals with climate change happened over vast landscapes, with forests moving from the Gulf Coast to the Rocky Mountains in just a few thousand years," Bloch said.

Teilhardina likely tracked the shifts in its forest habitats across the land bridges that then connected North America, Greenland and Eurasia, he said.

"Teilhardina is not throwing its bag over its shoulder and walking," he said. "Its range is shifting from one generation to the next. Over 1,000 years, you get a lot of movement, and over 2,000-3,000 years, you could easily cover continental distances."

While it was well-suited to Earth's hothouse environment, Teilhardina disappeared with the PETM, replaced by new and physically distinct primates. It's a sobering reminder of what can happen to species - including humans - during periods of swift climatic changes, Bloch said.

"A changing planet has dramatic effects on biology, ecosystems and evolution. It's part of the process that has produced the diversity of life we see today and mass extinctions of life that have happened periodically in Earth's history," Bloch said. "One of the unexpected results of global warming 56 million years ago is that it marks the origin of the group that ultimately led to us. How we will fare under future warming scenarios is less certain."

Credit: 
Florida Museum of Natural History

State lawmakers want to loosen childhood vaccine requirements, but legal barriers persist

image: This map shows the total number of pro- and anti-vaccine bills proposed by lawmakers in each state from 2011 to 2017.

Image: 
Drexel University

Despite an uptick in anti-vaccine legislation proposed by state lawmakers in recent years, pro-vaccine bills were more likely to be enacted into law, according to a new study by researchers at Drexel University. The results were published this week in the American Journal of Public Health.

"It is reassuring to know that the legislative process is working in favor of public health. It is concerning that there are so many anti-vaccination bills introduced, but our study shows that those bills are rarely signed into law," said study principal investigator Neal D. Goldstein, PhD, an assistant research professor in Drexel's Dornsife School of Public Health.

The use of nonmedical exemptions from vaccination requirements increased nationwide by 19 percent from 2009 to 2013, which has led to a disease resurgence in communities across the United States. However, both pro- and anti-vaccination policies vary widely state-by-state. The Drexel study, which analyzed all proposed and enacted vaccine legislation at the state level between 2011 and 2017, offers one of the first in-depth pictures of the country's vaccination policy climate.

"If you only look at current laws, that's history. But analyzing proposed bills gives us a flavor for what's happening now, and perhaps for what's to come. Are we seeing trends that may be concerning for the future?" Goldstein said.

During the seven-year study period, 175 bills related to immunization exemptions were introduced in state legislatures, with the volume increasing significantly over time. In 2011, there were 14 total bills proposed, compared to 41 in 2017.

The researchers found that the majority of vaccination legislation activity between 2011 and 2017 was consolidated to four states: New Jersey (29 total bills), New York (28), West Virginia (15) and Mississippi (12). New Jersey introduced the highest number of anti-vaccination bills (24), while New York and West Virginia introduced 14.

Of the 175 vaccination bills introduced, 92 (53 percent) were classified as anti-vaccine, and 83 (47 percent) were classified as pro-vaccine. Thirteen of the total number of bills (7 percent) were signed into law.
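The proportions reported above follow directly from the bill counts; a quick sketch (using only the figures given in the text) reproduces the rounding:

```python
# Reproduce the percentages reported in the Drexel study
# from the raw bill counts given in the text.
total_bills = 175
anti, pro, enacted = 92, 83, 13

assert anti + pro == total_bills  # the two categories partition all bills

print(round(100 * anti / total_bills))     # 53 (percent anti-vaccine)
print(round(100 * pro / total_bills))      # 47 (percent pro-vaccine)
print(round(100 * enacted / total_bills))  # 7  (percent signed into law)
```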

Although the majority of proposed legislation would have expanded access to vaccine exemptions, bills that limited exemptions - meaning they eliminated or made it more difficult for parents to opt their children out of mandatory school immunization requirements - were overwhelmingly more likely to be enacted into law. Only one anti-vaccination bill, 2011 Washington bill SB5005, ultimately became law. The law broadened the types of health care providers, beyond licensed physicians, who could sign a vaccination exemption form.

Pro-vaccine laws are an important protector of the public's health, according to Goldstein, because a high proportion of the population needs to be vaccinated to prevent outbreaks of contagious diseases. Measles, for example, requires about 95 percent of the population to be immunized. Those who choose not to vaccinate their children for non-medical reasons put communities at risk, as evidenced by states across the country experiencing record-high disease outbreaks this year, Goldstein said.

The recent anti-vaccination movement gained momentum after a study published in The Lancet in 1998, which suggested a link between the MMR (measles, mumps and rubella) vaccine and autism spectrum disorder. The study was subsequently debunked and retracted, and its author, Andrew Wakefield, lost his medical license.

However, that has not stopped a small, vocal minority of Americans from continuing to spread misinformation about the perceived health risks of vaccines. And Goldstein's recent study shows that the dangerous rhetoric has found its way into state legislatures.

New Jersey Assembly Bill 497, for example, would have exempted children under six years of age from the hepatitis B vaccine requirement if the child's mother tested negative for hepatitis B during her pregnancy. The bill explicitly listed "multiple sclerosis, chronic arthritis, autism spectrum disorder, and diabetes" as "diseases or adverse unintended consequences associated with receipt of the hepatitis B vaccine." There is no scientific evidence to support the bill's claims, Goldstein said.

"Several of the bills we saw were clearly not evidence-based," he added. "This serves as an opportunity for pro-vaccination constituents to become involved in the legislative process and ensure that state laws reflect the state of science."

Credit: 
Drexel University

It's not a shock: Better bandage promotes powerful healing

image: Materials science and engineering professor Xudong Wang fits a new wound dressing around the wrist of graduate student Yin Long. The device stimulates healing using electricity generated from the body's natural motions.

Image: 
UW-Madison photo by Sam Million-Weaver

MADISON - A new, low-cost wound dressing developed by University of Wisconsin-Madison engineers could dramatically speed up healing in a surprising way.

The method leverages energy generated from a patient's own body motions to apply gentle electrical pulses at the site of an injury.

In rodent tests, the dressings reduced healing times to a mere three days compared to nearly two weeks for the normal healing process.

"We were surprised to see such a fast recovery rate," says Xudong Wang, a professor of materials science and engineering at UW-Madison. "We suspected that the devices would produce some effect, but the magnitude was much more than we expected."

Wang and collaborators described their wound dressing method today (Nov. 29, 2018) in the journal ACS Nano.

Researchers have known for several decades that electricity can be beneficial for skin healing, but most electrotherapy units in use today require bulky electrical equipment and complicated wiring to deliver powerful jolts of electricity.

"Acute and chronic wounds represent a substantial burden in healthcare worldwide," says collaborator Angela Gibson, professor of surgery at UW-Madison and a burn surgeon and director of wound healing services at UW Health. "The use of electrical stimulation in wound healing is uncommon."

In contrast with existing methods, the new dressing is much more straightforward.

"Our device is as convenient as a bandage you put on your skin," says Wang.

The new dressings consist of small electrodes for the injury site that are linked to a band holding energy-harvesting units called nanogenerators, which are looped around a wearer's torso. The natural expansion and contraction of the wearer's ribcage during breathing powers the nanogenerators, which deliver low-intensity electric pulses.

"The nature of these electrical pulses is similar to the way the body generates an internal electric field," says Wang.

And, those low-power pulses won't harm healthy tissue like traditional, high-power electrotherapy devices might.

In fact, the researchers showed that exposing cells to high-energy electrical pulses caused them to produce almost five times more reactive oxygen species -- major risk factors for cancer and cellular aging -- than did cells that were exposed to the nanogenerators.

Also a boon to healing: They determined that the low-power pulses boosted viability for a type of skin cell called fibroblasts, and exposure to the nanogenerator's pulses encouraged fibroblasts to line up (a crucial step in wound healing) and produce more biochemical substances that promote tissue growth.

"These findings are very exciting," says collaborator Weibo Cai, a professor of radiology at UW-Madison. "The detailed mechanisms will still need to be elucidated in future work."

In that vein, the researchers aim to tease out precisely how the gentle pulses aid in healing. The scientists also plan to test the devices on pig skin, which closely mimics human tissue.

And, they are working to give the nanogenerators additional capabilities - tweaking their structure to allow for energy harvesting from small, imperceptible twitches in the skin or the thrumming pulse of a heartbeat.

"The impressive results in this study represent an exciting new spin on electrical stimulation for many different wound types, given the simplicity of the design," says Gibson, who will collaborate with the team to confirm the reproducibility of these results in human skin models.

If the team is successful, the devices could help solve a major challenge for modern medicine.

"We think our nanogenerator could be the most effective electrical stimulation approach for many therapeutic purposes," says Wang.

And because the nanogenerators consist of relatively common materials, price won't be an issue.

"I don't think the cost will be much more than a regular bandage," says Wang. "The device in itself is very simple and convenient to fabricate."

Credit: 
University of Wisconsin-Madison

Climate change and air pollution damaging health and causing millions of premature deaths

image: These are health impacts of exposure to ambient fine particulate matter (PM2.5) in 2015, by key sources of pollution. Coal as a fuel is highlighted by hatching.

Image: 
© IIASA

IIASA researchers have contributed to a major new report in The Lancet medical journal looking at the effects of climate change on human health, and the implications for society.

The 2018 Report of the research coalition The Lancet Countdown: Tracking Progress on Health and Climate Change shows that rising temperatures as a result of climate change are already exposing us to an unacceptably high health risk and warns, for the first time, that older people in Europe and the Eastern Mediterranean are particularly vulnerable to extremes of heat, markedly more so than people in Africa and Southeast Asia. The risk in Europe and the Eastern Mediterranean stems from aging populations living in cities, with 42% and 43% of over-65s respectively vulnerable to heat. In Africa, 38% are thought to be vulnerable, while in Asia it is 34%.

The report also states that ambient air pollution resulted in several million premature deaths from ambient fine particulate matter globally in 2015, a conclusion from IIASA researchers confirming earlier assessments. Since air pollution and greenhouse gases often share common sources, mitigating climate change constitutes a major opportunity for direct human health benefits.

Leading doctors, academics and policy professionals from 27 organizations have contributed analysis and jointly authored the report. Alongside IIASA, the partners behind the research include the World Bank, World Health Organization (WHO), University College London and Tsinghua University, among others.

IIASA researcher Gregor Kiesewetter led a team from the Air Pollution and Greenhouse Gases research program that estimated the dangers of air pollution to human health. A new and important finding this year was the global attribution of deaths to source. Kiesewetter and the team found that coal alone accounts for 16% of pollution-related premature deaths, around 460,000, which they state makes phasing out coal-use a "crucial no-regret intervention for public health".

Kiesewetter and the team used the GAINS Model, developed at IIASA, which calculates the emissions of precursors of particulate matter based on a detailed breakdown of economic sectors and fuels used.

Large contributions to ambient air pollution come from the residential sector, mostly from solid fuels like biomass and coal. Industry, electricity generation, transport, and agriculture are also important contributors. While coal should be a key target for early phase-out in households and electricity generation as it is highly polluting, it is not all that should be done.

"The attribution shows that unfortunately an approach targeting a single sector or fuel won't solve the problem - air pollution is a multi-faceted issue that needs integrated strategies cutting across many sectors, which will differ from country to country. This is what we typically do with the regional and local GAINS model: giving advice to policymakers on the most efficient approaches to tackle air pollution in their specific settings," says Kiesewetter.

The report contains a number of other headline findings:

157m more vulnerable people were subjected to a heatwave in 2017 than in 2000, and 18m more than in 2016.

153bn hours of work were lost in 2017 due to extreme heat as a result of climate change. China alone lost 21bn hours, the equivalent of a year's work for 1.4% of their working population. India lost 75bn hours, equivalent to 7% of their total working population.

Heat greatly exacerbates urban air pollution, with 97% of cities in low- and middle-income countries not meeting WHO air quality guidelines.

Heat stress, an early and severe effect of climate change, is commonplace and we, and the health systems we rely on, are ill equipped to cope.

Rising temperatures and unseasonable warmth are responsible for the spread of cholera and dengue fever, with vectorial capacity for their transmission increasing across many endemic areas.

The mean global temperature change to which humans are exposed is more than double the global average change, with temperatures rising 0.8°C versus 0.3°C.
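That last finding reflects exposure weighting: the report weights temperature change by where people live rather than by area, so warming in densely populated regions counts for more than warming over empty ocean or tundra. A minimal sketch with entirely hypothetical numbers:

```python
# Illustrative sketch of exposure (population) weighting.
# All populations and temperature changes below are hypothetical,
# chosen only to show how the weighted mean can exceed an area mean.
regions = [
    # (population in millions, local temperature change in degrees C)
    (4000, 1.1),   # densely populated, strongly warming
    (2000, 0.6),
    (1500, 0.3),   # large area, relatively few people
]

total_pop = sum(pop for pop, _ in regions)
exposure_weighted = sum(pop * dt for pop, dt in regions) / total_pop
print(f"{exposure_weighted:.2f} degrees C")  # 0.81 degrees C
```

Because most people live where warming in this toy example is strongest, the exposure-weighted change (0.81 degrees) sits well above the coolest regional values, mirroring the report's 0.8 degrees C versus 0.3 degrees C comparison.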

Hugh Montgomery, co-chair of The Lancet Countdown on Health and Climate Change and director of the Institute for Human Health and Performance, University College London says: "Heat stress is hitting hard - particularly amongst the urban elderly, and those with underlying health conditions such as cardiovascular disease, diabetes or chronic kidney disease. In high temperatures, outdoor work, especially in agriculture, is hazardous. Areas from northern England and California to Australia are seeing savage fires with direct deaths, displacement and loss of housing as well as respiratory impacts from smoke inhalation."

The report, which looks at 41 separate indicators across a range of themes, says urgent steps are needed to protect people now from the impacts of climate change. In particular, stronger labor regulations are needed to protect workers from extremes of heat and hospitals and the health systems we rely on need to be better equipped for extreme heat so they are able to cope. But the report also stresses that there are limits to adapting to the temperature increases, and if left unabated, climate change and heat will overwhelm even the strongest of systems, so the need for reducing greenhouse gas emissions is critical.

2018 has been an even hotter year in many parts of the world and the World Weather Attribution Study for northern Europe showed this summer's heat wave was twice as likely to have happened as a result of man-made climate change. Of the 478 global cities surveyed in the report, 51% expect climate change to seriously compromise their public health infrastructure.

"The world has yet to effectively cut its emissions. The speed of climate change threatens our, and our children's, lives. Following current trends, we will exhaust the carbon budget required to keep warming below two degrees by 2032. The health impacts of climate change above this level threaten to overwhelm our emergency and health services," says Anthony Costello, co-chair of The Lancet Countdown.

Other findings of the report include a new indicator mapping extremes of precipitation, which identifies South America and Southeast Asia among the regions most exposed to flood and drought. On food security, the report points to 30 countries experiencing downward trends in crop yields, reversing a decade-long trend of global improvement. Yield potential is estimated to be declining in every region as extremes of weather become more frequent and more severe.

Credit: 
International Institute for Applied Systems Analysis

Flexible electronic skin aids human-machine interactions

image: An electronic skin could help robots and prosthetic devices, such as this 3D-printed model, mimic the sense of touch.

Image: 
American Chemical Society

Human skin contains sensitive nerve cells that detect pressure, temperature and other sensations that allow tactile interactions with the environment. To help robots and prosthetic devices attain these abilities, scientists are trying to develop electronic skins. Now researchers report a new method in ACS Applied Materials & Interfaces that creates an ultrathin, stretchable electronic skin, which could be used for a variety of human-machine interactions.

Electronic skin could be used for many applications, including prosthetic devices, wearable health monitors, robotics and virtual reality. A major challenge is transferring ultrathin electrical circuits onto complex 3D surfaces and then having the electronics be bendable and stretchable enough to allow movement. Some scientists have developed flexible "electronic tattoos" for this purpose, but their production is typically slow, expensive and requires clean-room fabrication methods such as photolithography. Mahmoud Tavakoli, Carmel Majidi and colleagues wanted to develop a fast, simple and inexpensive method for producing thin-film circuits with integrated microelectronics.

In the new approach, the researchers patterned a circuit template onto a sheet of transfer tattoo paper with an ordinary desktop laser printer. They then coated the template with silver paste, which adhered only to the printed toner ink. On top of the silver paste, the team deposited a gallium-indium liquid metal alloy that increased the electrical conductivity and flexibility of the circuit. Finally, they added external electronics, such as microchips, with a conductive "glue" made of vertically aligned magnetic particles embedded in a polyvinyl alcohol gel. The researchers transferred the electronic tattoo to various objects and demonstrated several applications of the new method, such as controlling a robot prosthetic arm, monitoring human skeletal muscle activity and incorporating proximity sensors into a 3D model of a hand.

Credit: 
American Chemical Society

Understanding Down syndrome opens door to Alzheimer's prevention trials

Clinical trials for preventing Alzheimer's disease in people with Down syndrome may soon be possible thanks to new research from King's College London. The researchers found changes in memory and attention are the earliest signs of Alzheimer's in Down syndrome, and these changes start in the early 40s.

People with Down syndrome have an extremely high risk of developing Alzheimer's disease and cognitive decline starts at a much younger age than the general population. Yet people with Down syndrome have been largely excluded from clinical trials due to a lack of reliable data on how the disease progresses.

The new study, published in Alzheimer's & Dementia: The Journal of the Alzheimer's Association, is the largest in-depth study of cognitive abilities in people with Down syndrome worldwide, including 312 adults from a diversity of backgrounds. The findings pave the way for prevention trials by identifying the best age for treatments to be given, which symptoms to focus on and how many participants would need to take part in trials.

Andre Strydom, Professor of Intellectual Disabilities at the Institute of Psychiatry, Psychology & Neuroscience (IoPPN), says: 'Our findings show that trials would need relatively low numbers of people - approximately 100 - in order to show whether a drug can delay the symptoms of Alzheimer's, if treatment started in the mid-late 30s. Armed with this information we are one step closer to tackling the major cause of death in people with Down syndrome.'

Around one in 1000 people born in the UK have Down syndrome, a genetic condition caused by having an extra copy of chromosome 21. The extra chromosome includes a gene which controls the production of the protein beta-amyloid which collects in clumps in the brains of people with Alzheimer's. All adults over 35 with Down syndrome also have these clumps of beta-amyloid in their brains and can be considered in the early stages of the disease.

Clinical trials targeting beta-amyloid in people who already have symptoms of Alzheimer's have largely failed, and many scientists believe this is because treatments may only be effective if given during the very early stages of Alzheimer's, before many of the symptoms have started.

Dr Carla Startin, lead researcher for the Alzheimer's & Dementia study, formerly at the IoPPN and now at the University of Surrey, says: 'Because Alzheimer's is so common in people with Down syndrome, this gives us a unique opportunity to help us understand how Alzheimer's disease develops in general. We believe our results could have a considerable impact on the lives of people with Down syndrome and, if drug trials are successful, may also be relevant for the wider population.'

Another study by the same research group, recently published in JAMA Neurology, showed the huge impact that Alzheimer's has on people with Down syndrome. The researchers found dementia is now the likely underlying cause of death in more than 70% of adults with Down syndrome aged over 35 years.

Rosalyn Hithersay, the lead researcher for the JAMA Neurology study from the IoPPN, says: 'Our findings give stark evidence of the real need to support individuals with Down syndrome as they get older, and of the urgency of identifying treatments to delay or prevent Alzheimer's disease in this population.'

Professor Strydom, the senior researcher on both new studies, says: 'Now that all the pieces are in place, we are working with colleagues in research centres across Europe to start a clinical trial to prevent Alzheimer's in adults with Down syndrome in the next few years.'

Credit: 
King's College London

Nurture over nature: Even with common obesity gene variants, obese children lose weight after lifestyle changes

Children who are genetically predisposed to being overweight due to common gene variants can still lose weight by changing their diet and exercise habits. Around 750 children and adolescents with overweight or obesity undergoing lifestyle intervention participated in the study conducted by researchers from the University of Copenhagen and Holbæk Hospital.

Overweight and obesity constitute an increasing global problem that may lead to serious sequelae such as heart attacks, diabetes and cancer. In 2016, 124 million children and adolescents worldwide suffered from obesity. Now researchers from the University of Copenhagen and the Children's Obesity Clinic, the Department of Paediatrics at Holbæk Hospital have examined how genetics affect children and young people's ability to lose excess weight.

'We are trying to understand the genetic driving force behind overweight and whether this force also makes it impossible for some to lose weight. We show that a high genetic predisposition to overweight during childhood in fact had no influence on whether the children reacted to lifestyle intervention compared to children with low genetic predisposition to overweight. The 15 genetic variants we have studied are common in the population and are the ones that in general increase a child's risk of becoming overweight,' says Postdoc at the Novo Nordisk Foundation Center for Basic Metabolic Research at UCPH Theresia Maria Schnurr, who is one of the authors of the study.

The new research results have just been published in the scientific journal Obesity. The researchers' aim was to determine the influence of specific gene variants on children and adolescents' ability to lose weight. They therefore studied 15 specific gene variants that are implicated in childhood obesity and common in the population. In the study, the researchers demonstrate that these genetic variants did not predict whether children and adolescents were able to lose weight when they changed their lifestyle. So far, only children with a rare genetic mutation in the MC4R gene do not seem to lose weight when undergoing lifestyle intervention.

Lifestyle Intervention Led to Weight Loss

The researchers examined 754 children and adolescents with overweight and obesity. The median age was 11.6 years. The genetic profile of all participants was mapped, and the researchers then calculated a genetic risk score for childhood overweight for each participant based on the 15 genetic variants. All participants carried one or more of the 15 genetic variants associated with an increased risk of overweight and obesity during childhood. To determine whether a genetic predisposition for overweight affected their ability to lose weight, the children and adolescents had to implement a series of lifestyle changes.
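Genetic risk scores of this kind are typically built by counting risk alleles across the variants, optionally weighting each variant by its effect size. The study's exact scoring scheme is not described here, so the sketch below is generic, and the genotype values are hypothetical:

```python
# Generic sketch of a genetic risk score. The study's exact weighting
# scheme is not given in the text; the allele counts below are
# hypothetical and serve only to illustrate the calculation.
def risk_score(allele_counts, weights=None):
    """Sum risk-allele counts (0, 1 or 2 per variant), optionally
    weighted by each variant's effect size."""
    if weights is None:
        weights = [1.0] * len(allele_counts)
    return sum(count * w for count, w in zip(allele_counts, weights))

# One participant genotyped at 15 variants:
counts = [2, 1, 0, 1, 1, 2, 0, 0, 1, 2, 1, 0, 1, 1, 0]
print(risk_score(counts))  # unweighted score: 13.0
```

With effect-size weights the same function yields a weighted score, which is how such scores are usually reported in genome-wide association work.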

They followed a treatment protocol developed at Holbæk Hospital. The protocol centres around the family with behavioural lifestyle changes. For example, the children and adolescents had to change their diet, means of transportation, physical activity, sedentary activity, amount of sleep, consumption of snacks and sweet things and social activities. The intervention lasted six to 24 months. Subsequently, the researchers followed up on the treatment and found that the lifestyle changes had affected the weight of the participants, despite their genetic disposition for overweight and obesity.

'Large parts of the population believe that when you have problematic genes it is game over. That is why it is very important we send a clear message that even though you have a genetic sensitivity this treatment can help people. We have discovered that it does not matter whether the children and adolescents have an increased genetic risk score or not. They can respond to treatment just as well. This means our treatment is efficient despite carrying common obesity risk genes. It gives hope to people with obesity and obesity related complications such as high blood pressure, cholesterol and fatty liver that we can in fact help them,' says one of the study's authors Jens-Christian Holm, doctor and head of the Children's Obesity Clinic, Holbæk Hospital.

Genetic Markers

The genetic variants the researchers examined are common in the population and turned out not to have an effect on the ability to lose weight during the intervention. So far, the researchers have not found any biological markers for a poor response to lifestyle intervention, except for rare mutations in the MC4R gene, which are associated with a poor weight-loss response.

'Mutations in MC4R are rare, and thus the question remains why around 75 percent of children in a group receiving the exact same treatment react positively to it, compared to the remaining 25 percent of children not responding to lifestyle treatment. Identifying additional common genetic markers would help us understand the biological pathways that affect obesity and a person's reaction to lifestyle changes - and thus in the long term help us provide even better treatments,' says Professor at the Novo Nordisk Foundation Center for Basic Metabolic Research Torben Hansen, last author of the study.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Vapers can avoid relapsing to smoking, even after the odd cigarette

image: This is lead researcher, Dr Caitlin Notley, from UEA's Norwich Medical School.

Image: 
University of East Anglia

Smokers who switch to vaping don't lapse for long - according to new research from the University of East Anglia.

While vapers may occasionally have the odd smoking lapse, they don't see it as 'game over' for their quit attempt and it doesn't have to lead to a full relapse.

The findings, published in the journal Drug and Alcohol Review, suggest that vaping encourages not just smoking cessation, but long-term relapse prevention.

Lead researcher Dr Caitlin Notley, from UEA's Norwich Medical School, said: "E-cigarettes are the most popular aid to quitting smoking in the UK. Our previous research has shown that e-cigarettes are really important for helping people stay smoke free - by substituting the physical, psychological and social aspects of smoking.

"We wanted to know what happens when people who have switched to vaping, lapse back into smoking.

"It's really important to understand this so that we can develop advice, guidance and support to help people stay smoke free long term - particularly as relapsing back to smoking cigarettes is so harmful."

The research team interviewed 40 people who had quit smoking by switching to vaping. Around half reported either brief or regular lapses to tobacco smoking (sometimes called 'dual use') - particularly in social situations.

They found that people think of smoking lapses differently when they have switched to vaping, compared to other quit attempts.

Dr Notley said: "In the past - a brief smoking lapse would almost always lead to a full relapse, and people would usually feel like a failure for slipping up. But this was before people started switching to vaping.

"The difference is that, for some vapers, the odd cigarette was thought of as being 'allowed'. For others, an unintentional cigarette made them even more determined to maintain abstinence in future.

"Either way, it didn't necessarily lead to a full relapse back into smoking.

"Because vaping is a more pleasurable alternative, our research found that a full relapse into smoking isn't inevitable when people find themselves having the odd cigarette.

"There has been a lot of theorising around the process of smoking relapse after quit attempts. But all of these date back to pre-vaping times. This fresh evidence makes us question the usefulness of that understanding now that so many people are choosing to switch to vaping.

"For ex-smokers, vaping offers a pleasurable, social and psychological substitute to cigarettes - and it powerfully alters the threat of relapse. The old 'not a puff' advice may need revisiting."

Credit: 
University of East Anglia

Self-assessing back pain by app just as effective as traditional methods, study shows

image: Patient using an app-based assessment.

Image: 
University of Warwick

Study lends weight to argument for using mobile apps for routine measurements and clinical trials

Digital versions of existing assessments would be cheaper, greener and improve patient experience

Validating the effectiveness of health apps could be first step to a learning health service

Study by University of Warwick supports call by the Royal College of Physicians for greater use of already available technology in healthcare

Patients can assess their own back pain using an app on their phone or tablet as effectively as current paper methods, a new study from the University of Warwick has shown.

The study, published in the open-access Journal of Medical Internet Research, demonstrates that digital versions of established measurements for assessing back pain are just as reliable and responsive, opening the possibility for their use by patients for routine measurements and clinical trials.

The researchers see this study as a necessary first step in the greater use of digital media in clinical settings, in light of recent calls for greater use of such technology by healthcare providers.

For health issues that can't be readily measured, such as pain and depression, clinicians will often use self-assessment to monitor change. In most cases, this will take the form of a paper-based assessment. These go through thorough validation exercises to ensure that they measure what they are intended to measure, robustly and accurately.

The researchers created mobile app versions of the most commonly used measures in back pain trials: the Roland Morris Disability Questionnaire (RMDQ), visual analogue scale (VAS) of pain intensity, and numerical rating scale (NRS). These were developed with support from the University of Warwick Higher Education Innovation Fund with the aim of being used in clinical trials and for routine clinical measurements.

Back pain is the number one cause of disability globally, affecting up to 84% of people at some point in their lives. It is estimated that it costs the UK economy billions of pounds each year.

Lead author Dr Robert Froud from the University of Warwick Clinical Trials Unit said: "We have taken existing outcome measures and shown that they can be migrated to digital media and used in that format just as effectively as their paper-based versions. Our intention is to develop technology that allows people to securely complete these kinds of assessments on their own phones and tablets in a way that is safe, secure and accurate.

"If you can accurately monitor in clinical practice what's happening to patients' health, then analytically there is a lot that could be done with the data that will benefit patients. For example, we may be able to detect that particular treatment approaches are working better for certain types of people. We hear a lot about machine learning, but a learning healthcare system is perhaps next.

"The implications are quite big because we can aim to scale up. It opens up potential for the development of new instruments and dynamic instruments that adapt to the answers that a user gives. The potential of using digital technology in healthcare settings is quite extraordinary but you can't do any of that without first having assessments that work robustly and well."

Reliability and responsiveness were the criteria used to determine whether the apps were measuring as they should. Reliability means the result of the measure does not change when nothing has changed, while responsiveness means the result does change when a measurable factor has changed.

The researchers divided participants in the study into groups depending on whether they had recorded a change in their pain. People who had received treatment for their condition and improved tested the responsiveness of the apps. Those with chronic pain, who were less likely to improve, tested the apps for reliability.
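The logic of this two-group design can be illustrated with one of the standard responsiveness statistics, the standardized response mean (mean change divided by the standard deviation of change). The sketch below uses invented scores, not data from the study: the stable group should yield a value near zero, the improved group a large one.

```python
# Illustrative sketch with hypothetical data (not from the Warwick study):
# checking that a measure is stable where nothing changed (reliability)
# and registers change where improvement occurred (responsiveness).

from statistics import mean, stdev

def standardized_response_mean(baseline, follow_up):
    """Mean change divided by the SD of change; larger = more responsive."""
    changes = [b - f for b, f in zip(baseline, follow_up)]
    return mean(changes) / stdev(changes)

# Hypothetical RMDQ-style scores (0-24, higher = more disability).
stable_t1   = [14, 10, 18, 12, 16, 9]
stable_t2   = [13, 11, 18, 12, 15, 9]    # retest: nearly unchanged
improved_t1 = [15, 12, 20, 11, 17, 14]
improved_t2 = [8, 6, 11, 7, 9, 6]        # after treatment: clearly lower

srm_stable = standardized_response_mean(stable_t1, stable_t2)
srm_improved = standardized_response_mean(improved_t1, improved_t2)
print(f"SRM, stable group:   {srm_stable:.2f}")   # near zero
print(f"SRM, improved group: {srm_improved:.2f}") # large
```

A validated digital version should reproduce this pattern just as the paper form does: little apparent change in the chronic-pain group, a clear signal in the treated group.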

Digital tests have a number of advantages over paper-based versions, including their low cost, lower carbon footprint, better information security and improved participant experience.

Earlier this month, a new report from the Royal College of Physicians, Outpatients: The future - Adding value through sustainability, called for greater use of already available technology in healthcare.

Credit: 
University of Warwick

The protein with the starting gun

Bacteria are capable of extremely rapid growth, but only when the conditions are right. If they lack nutrients, or if it is too cold or dry, they will enter a dormant state to wait it out. Until now, the question of how individual bacterial cells decide whether to divide has generally been studied using populations that are happily growing. But to date nobody has been able to say what it is that prompts a dormant bacterium to wake up and start dividing.

Now Uwe Sauer, Head of the Institute of Molecular Systems Biology at ETH Zurich, and his team of researchers have solved this mystery. They studied the gut bacterium E. coli to find out what drives a cell's decision to divide for the first time. Surprisingly, the answer lies with a single protein in the interior of the bacterial cell: only when the concentration of this protein rises above a certain threshold will the cell divide. Based on their findings, the researchers developed a mathematical model. "Now, for the first time, this model makes quantitative predictions of when cell division will commence," Sauer says. Their study was recently published in the journal Molecular Systems Biology.

Rapid activation of metabolism

To understand this mechanism, the researchers first caused the E. coli bacteria to become dormant by starving them of nutrients. Next, they fed the cells with tiny droplets of glucose solution, which the bacteria lost no time in devouring: within seconds, their metabolism sprang into action. To demonstrate this, the researchers used a method developed in Sauer's lab that allows simultaneous real-time measurement of hundreds of metabolic products.

Their analysis showed that it took the bacteria an astonishingly short time to turn the glucose they were fed into new biomass, including amino acids and nucleotides, the building blocks from which proteins and DNA are subsequently made - the prerequisites for forming new cells.

Feeding frequency is key

Bacteria that are already in a growth phase will continue to divide as long as sufficient new biomass is present. But it was a different story with dormant cells: giving them the nourishing glucose drops every ten minutes caused them to produce more and more biomass over time, but they still didn't divide. Only when the researchers reduced the time interval and fed the cells every four minutes did cell division occur, albeit after one hour. Speeding up the feeding rate to once every minute caused cell division to commence almost immediately. "It wasn't the total amount of sugar that was decisive, but rather the feeding frequency," Sauer says.

This led the researchers to suspect that the cells were turning the glucose into a key protein, but that this protein was being broken down in the time between feeds. Only when glucose is provided at sufficiently frequent intervals is more of the protein produced than broken down, allowing the cell to divide. To test their hypothesis, the researchers combed the scientific literature for proteins that play a role in cell division and are broken down by the cell's own proteases. This is how they tracked down the protein FtsZ, which forms a ring during division that helps a cell split into two daughter cells.
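The production-versus-degradation logic described above can be captured in a toy simulation. This is not the quantitative model from the paper; every number here is invented for illustration. Each glucose pulse adds a fixed amount of FtsZ, proteases degrade it exponentially between pulses, and division is triggered only if the concentration ever crosses a threshold - so frequent small feeds succeed where infrequent ones fail.

```python
# Toy model of pulsed FtsZ production vs. continuous protease degradation.
# All parameter values are invented for illustration; only the qualitative
# behaviour (frequent feeding crosses the threshold, infrequent never does)
# mirrors the mechanism described in the article.

import math

def minutes_to_division(pulse_interval, pulse_amount=1.0,
                        decay_rate=0.05, threshold=4.0, max_minutes=600):
    """Simulate minute by minute; return the minute FtsZ first crosses
    the division threshold, or None if it never does."""
    ftsz = 0.0
    for minute in range(1, max_minutes + 1):
        if minute % pulse_interval == 0:
            ftsz += pulse_amount             # glucose pulse -> FtsZ synthesis
        ftsz *= math.exp(-decay_rate)        # protease degradation each minute
        if ftsz >= threshold:
            return minute
    return None                              # stays dormant within max_minutes

for interval in (10, 4, 1):
    print(f"feed every {interval:2d} min -> division at minute",
          minutes_to_division(interval))
```

With these made-up parameters, ten-minute feeding never accumulates enough FtsZ, four-minute feeding crosses the threshold after a delay, and one-minute feeding crosses it almost at once - the same ordering the researchers observed.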

Cell division protein FtsZ as a signal

Together with Professor Roman Stocker from the Institute of Environmental Engineering at ETH Zurich and Professor Suckjoon Jun from the University of California, San Diego, the researchers were able to demonstrate that FtsZ really is broken down in E. coli cells and that its concentration diminishes during periods of starvation. It turns out that this was indeed the key protein the researchers were looking for, because artificially accelerating the breakdown of FtsZ delayed the onset of cell division. In contrast, when the researchers used genetic methods to make the cells produce more FtsZ, those cells started dividing sooner. "This is how we showed that the concentration of FtsZ is the decisive signal for cells to begin dividing," Sauer says.

In his opinion, these new findings not only serve to further basic research, they might also form the basis for specific applications: FtsZ is present not just in E. coli but in almost all species of bacteria, including pathogens such as Mycobacterium tuberculosis. "If we want to prevent dormant bacteria from beginning to divide, then FtsZ is a good point of attack," Sauer says. For some years now, several laboratories have been conducting research into substances that accelerate the breakdown of FtsZ, which makes them promising candidates for new antibiotics.

Credit: 
ETH Zurich