Culture

Do bats adapt to gates at abandoned mines?

Abandoned mines can serve as roost sites for bats, but because the mines pose serious risks to humans, officials often install gates at their entrances. With more than 80,000 abandoned mines in the southwestern United States, these subterranean habitats are important to bat survival, especially as human disturbance from recreation and other activities at natural caves is reducing those caves' use by bats.

A new Journal of Wildlife Management study found that most gates installed today do not impede bats' use of a site, with bats acclimating over time after gates are placed. The new findings are important because, prior to the study, biologists knew little about the effect of gates on bat behavior.

Certain factors were more important than gate design in predicting the presence of some bat species, including elevation, portal area, and the numbers of mine levels and entrances. Although the researchers saw no difference in bats' responses to gate height or material, less maneuverable bat species initially collided with and landed on gates more frequently than did agile species.

The findings will inform management on closure methods at caves and abandoned mines in the United States and beyond.

"Bats are often viewed negatively, but they are critical to our ecosystems," said lead author Dr. Carol Chambers, of Northern Arizona University. "Bats face many difficulties today, from white-nose syndrome to habitat loss. Our findings help protect these animals and keep humans safe."

Credit: 
Wiley

Walking is more efficient than thought for threatened polar bears

A polar bear plunges into the icy Arctic waters in search of firmer ice; its world, which was once a sea of white, is melting beneath its paws. 'Research has documented declines in polar bear populations in some regions of the Arctic', says Anthony Pagano from the US Geological Survey, explaining that the bears now have to roam further on the receding ice to locate the seals upon which they dine. And, to make their predicament worse, measurements in the 1970s and 1980s suggested that polar bears consume more energy than other similarly sized animals because they have to generate heat to remain warm in the frigid environment and walk long distances to catch food. Knowing how much energy polar bears use just to remain alive is essential if we are to understand how the animals will survive in their dwindling environment, so Pagano and colleague Terrie Williams from the University of California, Santa Cruz, USA, embarked on an ambitious programme of measuring how much energy polar and grizzly bears consume as they amble along. The scientists publish their discovery that polar bears and grizzly bears walk efficiently, consuming the same amount of energy while walking as other large animals, in Journal of Experimental Biology at http://jeb.biologists.org.

'Our conversations with zoos for this study started in 2012', says Pagano, recalling how he and Williams contacted Amy Cutting, Nicole Nicassio-Hiskey and Amy Hash at Oregon Zoo, Portland, USA, and Megan Owen, Tammy Batson and Nate Wagner at San Diego Zoo, USA, as both teams had trained polar bears to participate in husbandry procedures such as providing blood samples for health tests. However, Pagano and Williams wanted to measure how much oxygen the 240kg animals consumed to calculate how much energy they were using while walking, and the conventional method of placing a mask over the bear's muzzle would not work: 'Big carnivores do not like things on their faces', Williams explains. Instead, Charlie Robbins and Tony Carnahan from Washington State University, USA, built a custom-designed bear-proof metabolic chamber by installing a 3.6m long horse treadmill in a steel-framed chamber constructed from bullet-proof polycarbonate.

The team then transported the 2000kg structure to the polar bears' respective locations, where Nicassio-Hiskey and Hash (Portland) and Batson, Owen and Wagner (San Diego) spent months patiently training the animals to walk on the treadmill. Recalling this period, Pagano says, 'Finding foods that the polar bears would be highly motivated to walk for was challenging'. However, the grizzly bears at Washington State University, USA, were more eager: 'They just bowled right in; they did not care if the treadmill moved fast or slow, all they cared about were the training treats', laughs Williams. Once the bears were comfortable with walking in the metabolic chamber, the team began measuring the animals' oxygen consumption while filming and recording their movements.

However, when they calculated the amount of energy consumed by the polar bears and grizzlies while sauntering at speeds of up to 4.6km/h, they were surprised to find that the two species consumed the same amount of energy (2.21kJ/(kg·m)) and no more than similarly sized animals. The polar bears' walking metabolic rate was not intrinsically higher than that of other large mammals, but the team suspect that swimming could be more costly. And when they fitted GPS collars to six wild female polar bears on the Alaskan sea ice, it was clear that the wild bears were moving at similar speeds to the captive animals, ambling at around 3.4km/h and rarely breaking into a run, so their movements were just as efficient. However, the news wasn't all good: simply standing up was more costly for both species than it is for other large animals, which could impact polar bears detrimentally as their survival teeters on thin ice.

Credit: 
The Company of Biologists

Say cheese! Why a toothy smile makes it easier for you to be identified

image: It is easier to match images of smiling faces.

Image: 
University of York

A broad, toothy smile in a photo makes it easier for people to identify the individual, say researchers at the University of York.

Previous research at York has shown that it is difficult for people to match a pair of unfamiliar faces in photographs, making it harder for authorities to spot identity fraud. Research has also shown that new face morphing technology can deceive not only human eyes but smartphone software as well.

To improve accuracy rates, research led by Dr Mila Mileva at the University's Department of Psychology looked into the possibility that a smiling image of a person could be more easily matched to a different image of that same individual, as well as more easily distinguished from an image of a similar-looking identity.

Dr Mileva said: "Photo ID is a significant part of our lives and yet we know that the human brain has a hard time matching photos of people to other photos and matching photos with the real-life person. Identity fraud is a real problem on many levels, so it is important that we do more research in this area to see how we can improve methods of identification."

The team conducted three studies: in one, 40 participants were asked to match 60 unfamiliar images of neutral facial expressions, much like a passport photo, with open-mouth smiling faces; in another, participants compared neutral expressions with images of closed-mouth smiling faces.

In the third study, they asked 34 participants to match images where only the lower part of the face was visible. Across all three studies, the researchers found that participants matched two images of the same person more successfully when an open-mouth smile was displayed than when the expression was either neutral or a closed-mouth smile.

Analysis of the results from the first study found a 9% improvement in performance for smiling images when comparing two images of the same person, and a 7% improvement when comparing images of two different but similar-looking people.

Dr Mileva said: "Our research suggests that replacing the neutral expression we usually use when taking identification photographs with an open-mouth smile can make face matching an easier decision.

"As soon as there's a mismatch in emotional expression - comparing a smiling and a neutral image for example - the matching accuracy drops substantially.

"We also had success in showing that an open-mouth smile can help people to tell two similar-looking but different people apart, which is critical when checking photo identification."

Researchers say that an image of a smiling person at passport check points, for example, could see an improved identification success rate compared to the neutral image, but a more practical solution could be to embed a smiling picture in the chip, so that passport officials have access to both a neutral image and a smiling image to assist them in their decision-making.

The research is published in the British Journal of Psychology.

Credit: 
University of York

Strange 'nude' fossil creature from half a billion years ago

image: The new species of fossil chancelloriid: an enigmatic animal from the Cambrian Period with a tube-like body, 'minotaur-horn' spines, and doughnut-shaped scars.

Image: 
Derek Siveter/Tom Harvey/Peiyun Cong

Scientists have discovered the fossil of an unusual large-bodied 'nude' sea-creature from half a billion years ago.

The creature belongs to an obscure and mysterious group of animals known as the chancelloriids - and scientists are unclear about where they fit in the tree of life.

They represent a lineage of spiny tube-shaped animals that arose during the Cambrian evolutionary "explosion" but went extinct soon afterwards. In some ways they resemble sponges, a group of simple filter-feeding animals, but many scientists have dismissed the similarities as superficial.

The new discovery by a team of scientists from the University of Leicester, the University of Oxford and Yunnan University, China, adds new evidence that could help solve the mystery.

The researchers have published their findings in Proceedings of the Royal Society B (doi: 10.1098/rspb.2018.0296). The Leicester authors are Tom Harvey, Mark Williams, David Siveter and Sarah Gabbott.

The new species, named Allonnia nuda, was discovered in the Chengjiang deposits of Yunnan Province, China. It was surprisingly large in life (perhaps up to 50 cm or more) but had only a few very tiny spines. Its unusual "naked" appearance suggests that further specimens may be "hiding in plain sight" in fossil collections, and shows that this group was more diverse than previously thought.

Furthermore, the new species holds clues about the pattern of body growth, with clear links to modern sponges. It is too soon to say the mystery has been solved, but the discovery highlights the central role of sponge-like fossils in the debate over earliest animal evolution.

Dr Tom Harvey, from the University of Leicester's School of Geography, Geology and the Environment, explained: "Fossil chancelloriids were first described around 100 years ago, but have resisted attempts to place them in the tree of life. We argue that their pattern of body growth supports a link to sponges, reinvigorating an old hypothesis. We're not suggesting that it's 'case closed' for chancelloriids, but we hope our results will inspire new research into the nature of the earliest animals."

Dr Peiyun Cong, from the Yunnan Key Laboratory for Palaeobiology, Kunming, China, and The Natural History Museum, UK, added: "The Chengjiang deposits of Yunnan Province continue to reveal surprising new fossils we could hardly have imagined. Together, they provide a crucial snapshot of life in the oceans during the Cambrian explosion."

Credit: 
University of Leicester

Living the high life: How altitude influences bone growth

High altitude is a particularly challenging environment: the terrain is physically demanding and the land has relatively poor crop yields, so food can be scarce. Most importantly, oxygen levels are lower, meaning that the body converts food into energy less efficiently, leaving relatively limited energy available for growth.

In new research published today in the journal Royal Society Open Science, an international team of scientists examine how high altitude and the associated limited available energy affects the growth of long bones in Himalayan populations.

By measuring the limbs of people of similar ancestry from high-altitude and low-altitude regions, the team found that those living at high altitude had significantly shorter lower arm segments. However, the lengths of the upper arm and hand were much the same in both groups.

According to lead author Stephanie Payne of the Department of Archaeology, University of Cambridge, "Our findings are really interesting as they show that the human body prioritises which segments to grow when there is limited energy available for growth, such as at high altitude. This comes at the expense of other segments, for example the lower arm. The body may prioritise full growth of the hand because it is essential for manual dexterity, whilst the length of the upper arm is particularly important for strength."

"We're really grateful to the Himalayan Sherpa populations who participated in the study. We measured and examined over 250 individuals and then compared our findings to genetically similar Tibetan groups living in the lowlands of Nepal."

"Our research actually mirrors evidence previously found in Andean populations. Our study demonstrates a similar pattern of growth prioritisation across the limb segments."

But, as Payne can attest to, there are many challenges associated with conducting research at high altitude. In addition to undertaking the so-called "most dangerous flight in the world" from Kathmandu to Lukla airport, a two day trek at c. 3500m above sea level left Payne, a National Geographic young explorer, suffering from altitude sickness.

"We used Namche Bazaar on the Everest Trail as our base to conduct the study with the local population. It was an incredible experience with yaks carrying our anthropometric equipment through the mountain passes. It is not an easy journey, but thanks to the dedicated team, including my research assistant Oliver Melvill, we were able to collect the valuable data and obtain a suitable sample size."

Whilst this pattern of differential limb segment growth is interesting, scientists are still uncertain of the biological mechanism behind it. Further research is required to determine whether it might be related to temperature changes down the limb during growth, altered blood flow down the limb, differences in nutrient delivery between limb segments, or another, as yet unknown, mechanism.

Credit: 
University of Cambridge

How vaping helps even hardened smokers quit

Vaping helps people stop smoking - even when they don't want to, according to new research from the University of East Anglia.

A study published today shows that smokers who switch to vaping may be better able to stay smoke-free in the long term.

It also found that even people who didn't want to stop smoking eventually quit because they found vaping more enjoyable.

Lead researcher Dr Caitlin Notley from UEA's Norwich Medical School said: "E-cigarettes are at least 95 per cent less harmful than tobacco smoking, and they are now the most popular aid to quitting smoking in the UK.

"However the idea of using e-cigarettes to stop smoking, and particularly long-term use, remains controversial.

"We wanted to find out about how people use e-cigarettes to quit smoking - and whether vaping supports long-term smoking abstinence."

The research team carried out in-depth interviews with 40 vapers. They asked them about their tobacco smoking history and prior quit attempts, and about how they started vaping, their vape set-up, preferred flavours and strength, and whether they had switched to vaping in an attempt to quit smoking.

They also asked them about situations and experiences that caused them to relapse into tobacco smoking.

"We found that vaping may support long-term smoking abstinence," said Dr Notley. "Not only does it substitute many of the physical, psychological, social and cultural elements of cigarette smoking, but it is pleasurable in its own right, as well as convenient and cheaper than smoking.

"Our study group also felt better in themselves - they noticed better respiratory function, taste and smell.

"But the really interesting thing we found was that vaping may also encourage people who don't even want to stop smoking, to eventually quit."

While most of the sample group reported long histories of tobacco smoking and multiple previous quit attempts, a minority (17 per cent) said they enjoyed smoking and had never seriously attempted to quit.

"These were our accidental quitters," said Dr Notley. "They hadn't intended to quit smoking and had tried vaping on a whim, or because they had been offered it by friends. They went on to like it, and only then saw it as a potential substitute for smoking."

"Many people talked about how they saw vaping as a 'no pressure' approach to quitting," she added.

While most of the group switched quickly and completely from smoking to vaping, some found themselves using both cigarettes and e-cigarettes for a time before gradually stopping smoking altogether.

"We found that people did occasionally relapse with a cigarette, mainly due to social or emotional reasons, but it didn't necessarily lead to a full relapse.

"This study suggests that vaping is a viable long-term substitute for smoking, with substantial implications for tobacco harm reduction."

The study was funded by Cancer Research UK.

Alison Cox, director of cancer prevention at Cancer Research UK, said: "The evidence so far shows that e-cigarettes are far safer than tobacco.

"E-cigarettes do still contain nicotine, which is addictive, but it's not responsible for the major harms of smoking. This is why they have great potential as an aid to help people quit smoking for good.

"It's great to see this early indication that e-cigarettes could encourage smokers who weren't originally thinking of quitting to give up. But more research is needed to understand exactly how e-cigarettes are being used by people who don't want to stop smoking and how often this results in quitting.

"E-cigarettes are just one option for quitting - your local Stop Smoking Service can give you free advice on the best method for you, and with their support you'll have the best chance of success."

Credit: 
University of East Anglia

Methadone and buprenorphine reduce risk of death after opioid overdose

A new study has found that treatment with methadone or buprenorphine following a nonfatal opioid overdose is associated with significant reductions in opioid-related mortality. The research, published today in the Annals of Internal Medicine, was co-funded by the National Institute on Drug Abuse (NIDA) and the National Center for Advancing Translational Sciences, both parts of NIH.

Study authors analyzed data from 17,568 adults in Massachusetts who survived an opioid overdose between 2012 and 2014. Compared to those not receiving medication-assisted treatment, opioid overdose deaths over the 12-month follow-up period decreased by 59 percent for those receiving methadone and by 38 percent for those receiving buprenorphine. The authors were unable to draw conclusions about the impact of naltrexone due to small sample size, noting that further work is needed with larger samples. Buprenorphine, methadone, and naltrexone are three FDA-approved medications used to treat opioid use disorder (OUD).

The study, the first to look at the association between using medication to treat OUD and mortality among patients experiencing a nonfatal opioid overdose, confirms previous research on the role methadone and buprenorphine can play to effectively treat OUD and prevent future deaths from overdose.

Despite compelling evidence that medication-assisted treatment can help many people recover from opioid addiction, these proven medications remain greatly underutilized. The study also found that in the first year following an overdose, less than one third of patients were provided any medication for OUD: methadone (11 percent), buprenorphine (17 percent), or naltrexone (6 percent), with 5 percent receiving more than one medication.
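A rough back-of-the-envelope check (an illustration of the reported percentages, not an analysis from the paper, and assuming the 5 percent who received more than one medication are counted once in each individual medication's figure) shows why the combined figure falls under one third:

```python
# Illustrative arithmetic only, using the percentages reported in the study.
# Assumption: the 5% receiving more than one medication appear in each
# individual medication's figure, so the overlap is subtracted once.
methadone = 11      # % of overdose survivors receiving methadone
buprenorphine = 17  # % receiving buprenorphine
naltrexone = 6      # % receiving naltrexone
overlap = 5         # % receiving more than one medication

any_medication = methadone + buprenorphine + naltrexone - overlap
print(any_medication)  # 29 -> under one third of patients received any OUD medication
```

If any patients received all three medications, the true combined figure would differ slightly, but it remains well below one third either way.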

In an editorial commenting on the study, Dr. Nora Volkow, director of NIDA, said, "A great part of the tragedy of this opioid crisis is that, unlike in previous such crises America has seen, we now possess effective treatment strategies that could address it and save many lives, yet tens of thousands of people die each year because they have not received these treatments. Ending the crisis will require changing policies to make these medications more accessible and educating primary care and emergency providers, among others, that opioid addiction is a medical illness that must be treated aggressively with the effective tools that are available." The editorial was co-authored by NIDA scientist Dr. Eric Wargo.

Another alarming study finding was that, despite having had an opioid overdose, 34 percent of survivors received one or more prescriptions for opioid painkillers over the next 12 months, and 26 percent were prescribed benzodiazepines.

"Nonfatal opioid overdose is a missed opportunity to engage individuals at high risk of death," said Marc Larochelle, M.D., the study's lead investigator at Boston Medical Center's Grayken Center for Addiction and Boston University School of Medicine. "We need to better understand barriers to treatment access and implement policy and practice reforms to improve both engagement and retention in effective treatment."

The authors conclude that a nonfatal opioid overdose treated in the emergency department is a critical time to identify people with OUD, and an opportunity to offer patients access to treatment interventions, provide linkage to care following discharge, and improve treatment retention.

Credit: 
NIH/National Institute on Drug Abuse

Climate change to overtake land use as major threat to global biodiversity

Climate change will have a rapidly increasing effect on the structure of global ecological communities over the next few decades, with amphibians and reptiles being significantly more affected than birds and mammals, a new report by UCL finds.

The pace of change is set to outstrip the losses to vertebrate communities caused by land use for agriculture and settlements, which is estimated to have already removed over ten per cent of biodiversity from ecological communities.

Previous studies have suggested that ecosystem function is substantially impaired where more than 20 per cent of species are lost; this is estimated to have occurred across over a quarter of the world's surface, rising to nearly two thirds when roads are taken into account.

The new study, published today in Proceedings of the Royal Society B, shows that the effects of climate change on ecological communities are predicted to match or exceed land use in its effects on vertebrate community diversity by 2070.

The findings suggest that efforts to minimise human impact on global biodiversity should now take both land use and climate change into account instead of just focusing on one over the other, as the combined effects are expected to have significant negative effects on the global ecosystem.

Study author, Dr Tim Newbold (UCL Genetics, Evolution & Environment), said: "This is the first piece of research looking at the combined effects of future climate and land use change on local vertebrate biodiversity across the whole of the land surface, which is essential when considering how to minimise human impact on the local environment at a global scale.

"The results show how big a role climate change is set to play in decreasing levels of biodiversity in the next few decades and how certain animal groups and regions will be most affected."

Dr Newbold's research has found that vertebrate communities are expected to lose between a tenth and over a quarter of their species locally as a result of climate change.

Furthermore, when combined with land use, vertebrate community diversity is predicted to have decreased substantially by 2070, with species potentially declining by between 20 and nearly 40 per cent.

The effect of climate change varies around the world. Tropical rainforests, which have seen lower rates of conversion to human use than other areas, are likely to experience large losses as a result of climate change. Temperate regions, which have been the most affected by land use, stand to see relatively small biodiversity changes from future climate change, while tropical grasslands and savannahs are expected to see strong losses as a result of both climate change and land use.

Credit: 
University College London

Can evolution explain why the young are often more susceptible than adults to infection?

In many species, including humans, the young are often more susceptible to infection than adults, even after accounting for prior exposure to infection. From an evolutionary perspective this may seem puzzling, as dying young or becoming infertile due to infection means organisms will be unable to reproduce. However, new research from the University of Bath suggests that many species may have evolved to prioritise growth over immunity while maturing.

Understanding precisely how immunity varies with age in different species is complex. Humans, like other vertebrates, possess both innate and adaptive immune responses, but the adaptive component is only effective against infections following exposure. Since younger individuals are less likely to have prior exposure to many infections, they are expected to be more susceptible. Yet even after accounting for prior exposure, there is growing evidence that children are inherently more susceptible than adults to certain infections. Similarly, many animal and plant species which lack adaptive immune systems have also been found to be more susceptible during juvenile stages, suggesting this phenomenon is widespread in nature.

In a new study published in Proceedings of the Royal Society B, scientists from University of Bath and the University of Virginia use theoretical models to predict how and when juveniles evolve to be more susceptible than adults to infection. Crucially, the researchers study what happens if juveniles have to choose between using their limited resources for growth or to prevent infection.

Dr Ben Ashby, lead author on the paper and a research fellow funded by the Natural Environment Research Council (NERC) in Bath's Department of Mathematical Sciences, explains: "By temporarily diverting resources away from immunity during development, organisms are at greater risk of infection while young but can grow faster or larger, giving them an advantage during adulthood."

The models show that the extent to which juveniles evolve to be more susceptible than adults depends on both the life cycle of the host and the characteristics of the disease.

On the use of mathematical models, Dr Ashby said: "Studying simple mathematical models allows us to make general predictions about how juvenile susceptibility is likely to evolve in nature, telling us how factors such as lifespan and the length of the juvenile period affect the trade-off organisms may face between growth and immunity."

Indeed, the models predict that juvenile susceptibility should generally be lowest when organisms have lifespans that are neither too short nor too long. If the lifespan of the host is too short, then it is difficult for the disease to spread and so hosts can risk being more susceptible during development. If hosts have long lifespans with relatively short juvenile stages, then the risk of increased susceptibility while developing is only incurred for a brief time and so juvenile susceptibility is again favoured.

In future, the team hopes to test their predictions by studying seedling resistance in plants. Dr Ashby added: "Many important crops have been artificially selected for seedling resistance, so we know that it is physiologically possible but often doesn't evolve in nature. It seems likely that this is because plants, like other hosts, have to balance resources during development between growth and immunity."

Credit: 
University of Bath

Children's immune system could hold the key to preventing sepsis

Scientists have identified the key response that children use to control infections

Children are naturally more resistant to lots of infectious diseases

Sepsis affects more than 20 million people worldwide and is responsible for more deaths in the UK than bowel, breast and prostate cancer combined

Children's immune systems could hold the key to preventing life-threatening infections and sepsis, a new study has revealed.

The ground-breaking research, conducted by an international team of scientists at the University of Sheffield and the Harvard T.H. Chan School of Public Health, has identified the key response that children use to control infections, making them resilient to many severe infections and sepsis.

Sepsis, also known as blood poisoning, is the reaction to an infection which causes the body to attack its own organs and tissue.

It affects more than 20 million people worldwide and is responsible for more deaths in the UK than bowel, breast and prostate cancer combined. It is often referred to as the hidden killer as symptoms initially present themselves as flu, gastroenteritis or a chest infection.

The new study, which is the first of its kind, has helped scientists identify key differences in cell-pathway activity in the blood of septic adults and children. Establishing the pathways that help prevent sepsis is a powerful new way to discover drugs for intervention against sepsis and provides direct insight into potential cures for the disease.

Winston Hide, Professor of Computational Biology at the University of Sheffield's Institute of Translational Neuroscience (SITraN), is an author of the pioneering study published in the journal Molecular Systems Biology.

"Children are naturally more resistant to lots of infectious diseases," said Professor Hide.

"During outbreaks like Spanish flu and Ebola we know that children survived much better than adults. By analysing the blood profiles of infected children and comparing them to adults with sepsis we were able to identify children whose natural resilience helped them to ward off infection.

"By using the lessons we have learnt from the immune systems of children, scientists can now unlock how to control the disease and prevent it from occurring as opposed to trying to fight the disease once it has manifested itself."

The findings of the study are now being used to design drugs for research into prevention of other pathological diseases including Alzheimer's.

Credit: 
University of Sheffield

D for danger! Speech sounds convey emotions

image: This is Zachary Estes in his office at Bocconi University, Milan.

Image: 
Paolo Tonato

Individual speech sounds - phonemes - are statistically associated with negative or positive emotions in several languages, according to new research published in the journal Cognition by Bocconi Professor Zachary Estes, his Warwick colleague James Adelman and Bocconi student Martina Cossu. These associations help us avoid dangers quickly: the phoneme-emotion associations are strongest at the beginning of a word, and the phonemes that are spoken fastest tend to have a negative association.

It has long been known that phonemes systematically convey a range of physical properties such as size and shape. For example, the 'e' sound in Beetle sounds small, whereas the 'u' sound in Hummer sounds big. This is known as sound symbolism.

Given the evolutionary importance of avoiding dangers and approaching rewards, Estes and colleagues hypothesized that, like size and shape, emotion should also have sound symbolic associations. They tested this prediction in five languages - English, Spanish, Dutch, German and Polish - and in all five languages particular phonemes did indeed occur more often in positive or negative words.

Estes and colleagues also tested whether this emotional sound symbolism could be an adaptation for survival. To aid survival, communication about opportunities and especially dangers needs to be fast. The researchers tested this assumption in two ways.

First, they showed that in all five languages the phoneme-emotion associations are stronger at the beginnings of words than at the middle or ends of words. This allows emotion to be understood fast, even before the whole word is spoken.
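As a toy illustration of that word-initial counting logic (the word lists and the use of initial letters as stand-ins for phonemes are assumptions for illustration, not the authors' data or method):

```python
from collections import Counter

# Hypothetical mini-lexicons; real studies use large rated word corpora.
negative = ["danger", "dread", "doom", "dark"]
positive = ["peace", "play", "pal", "pure"]

# Count which sounds occupy the word-initial position in each emotion class.
first_neg = Counter(w[0] for w in negative)
first_pos = Counter(w[0] for w in positive)
print(first_neg)  # 'd' dominates word-initial position in negative words
print(first_pos)  # 'p' dominates word-initial position in positive words
```

Scaled up across thousands of words and five languages, this kind of positional frequency comparison is what reveals that the association is concentrated at word onsets.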

Second, they examined the speed with which specific phonemes can be spoken. Estes and colleagues discovered that phonemes that can be spoken faster are more common in negative words. This allows dangers to be understood faster than opportunities, and this aids survival because avoiding dangers is more urgent than winning rewards. For instance, being too slow to avoid a snake can be fatal, but if you're too slow to catch a bird, you will probably have other chances.

Estes and his colleagues argue that emotional sound symbolism evolved due to its adaptive value to humans: it made communication about dangers and opportunities more efficient, allowing a quicker reaction to vital objects and thereby supporting the fitness and survival of the human species.

First author James Adelman said "In debates about whether human language abilities evolved from more general cognitive skills or more specific communicative adaptations, these findings reveal one specific adaptation. Our findings suggest that the ability to appreciate very short speech sounds could have helped humans to efficiently warn kin and peers, aiding survival."

Zachary Estes added, "We have also begun testing applications in business, because emotional phonemes provide an opportunity for companies to inform consumers about their products. For example, a pharmaceutical company might want to use positive sounds for a drug that promotes health benefits, like a vitamin, but they might want to use negative sounds for a drug that prevents health detriments, like an anti-malarial drug."

Credit: 
Bocconi University

Opioid dependence in patients with degenerative spondylolisthesis: More likely to occur before than after surgery

image: This is a flowchart showing changes in opioid dependence among patients who underwent surgery for degenerative spondylolisthesis.

Image: 
(c) American Association of Neurological Surgeons.

CHARLOTTESVILLE, VA (JUNE 19, 2018). Researchers investigated risk factors for the development of opioid dependence in patients undergoing surgery for degenerative spondylolisthesis (DS). They found that, overall, patients were more likely to have a dependency on opioid medications before surgery than afterward. This finding and more appear in a new article published today in the Journal of Neurosurgery: Spine: "Factors predicting opioid dependence in patients undergoing surgery for degenerative spondylolisthesis: analysis from the MarketScan databases" by Mayur Sharma, MD, MCh, and colleagues.

A cursory glance at news headlines confirms that the United States is in the midst of an opioid epidemic. And it can be deadly: in 2015, opioid overdose was cited as a cause of more than 33,000 deaths. A large proportion of opioid addiction can be traced back to the misuse of physician-prescribed medications initially provided for the management of acute or chronic pain.

A common site of pain is in the lower back. Approximately 80% of adults experience low back pain at some time during their lives. In fact, low back pain has been cited as the single leading cause of disability.

In this article, the authors set out to identify what effect on opioid dependence surgery may have when used to treat patients with degenerative spondylolisthesis (DS), the forward slippage of a vertebra onto the vertebra beneath it. DS usually occurs in the lumbar spine and is due to a weakness in bones, joints, and ligaments that accompanies the aging process. Symptoms include pain in the lower back and legs, leg fatigue, muscle spasms, and irregular gait. Most of the time DS can be treated without surgery; however, surgery is indicated if there is progressive neurological damage or the patient's pain is disabling and does not respond adequately to nonsurgical treatment.

For their analysis, the authors defined indicators of opioid dependence as follows: continued opioid use, more than 10 opioid prescriptions, or either a diagnosis of opioid dependence disorder or a prescription for treating opioid dependence disorder during the period of 1 year before or 3 to 15 months after surgery.
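Those criteria can be sketched as a simple any-of check. This is an illustrative encoding only; the field names are assumptions, not the MarketScan variables used by the authors.

```python
# A patient meets the study's dependence indicator if ANY criterion holds.
def is_opioid_dependent(continued_use, n_prescriptions,
                        dependence_diagnosis, dependence_treatment_rx):
    return (continued_use
            or n_prescriptions > 10
            or dependence_diagnosis
            or dependence_treatment_rx)

# A patient with 12 opioid prescriptions and no other flags qualifies.
print(is_opioid_dependent(False, 12, False, False))  # True
```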

The authors extracted de-identified data from the MarketScan databases on 10,708 patients who had undergone surgery for DS. The median age of these patients was 61 years (interquartile range: 54 to 69 years). Sixty-five percent of the patients were women. In most cases (94%), the surgery was decompression with fusion, and in 76% of patients surgery involved multiple vertebrae. Many patients (54%) had one or more comorbidities. The majority of patients had commercial health insurance (61% as opposed to 35% with Medicare).

The authors were particularly interested in evaluating opioid dependence after surgery for DS, but they did examine preoperative opioid dependence to identify new cases of dependency. The authors identified a dependency on opioid medications in 15% (1,591) of the patients with DS before surgery. Between 3 and 15 months after surgery, however, the percentage of patients with a dependency on opioids was 10% (1,060).
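The reported rates follow directly from the counts in the article:

```python
# Figures from the article: dependence before vs. after surgery for DS.
total = 10708
pre_dependent = 1591    # within 1 year before surgery
post_dependent = 1060   # 3 to 15 months after surgery

pre_rate = pre_dependent / total    # ~0.149, reported as 15%
post_rate = post_dependent / total  # ~0.099, reported as 10%
print(f"{pre_rate:.1%} before vs {post_rate:.1%} after")
```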

After evaluating the impact of surgery, patient age and sex, comorbidities, and type of medical insurance held by the patients, the authors determined the following applied to patients who underwent surgery to treat DS:

There was an association between surgical decompression with fusion and a decreased risk of postoperative opioid dependence. In this study, opioid dependence was reduced by 5 percentage points (from 15% before surgery to 10% afterward).

Preoperative opioid dependence was associated with an increased risk of postoperative opioid dependence.

Increased patient age was associated with a decreased risk of postoperative opioid dependence.

Following surgery for DS, these patients were twice as likely to become opioid independent as they were to become opioid dependent.

When asked to summarize the findings of the study, Dr. Sharma said, "Decompression and fusion for DS is associated with reduced risk of opioid dependency."

Credit: 
Journal of Neurosurgery Publishing Group

Low vitamin D levels associated with scarring lung disease

Reviewing medical information gathered on more than 6,000 adults over a 10-year period, Johns Hopkins researchers have found that lower than normal blood levels of vitamin D were linked to increased risk of early signs of interstitial lung disease (ILD).

Interstitial lung disease is a relatively rare group of disorders characterized by lung scarring and inflammation that may lead to progressive, disabling and irreversible lung damage. An estimated 200,000 cases a year are diagnosed in the United States, most of them caused by environmental toxins such as asbestos or coal dust, but cases can also arise from autoimmune disorders, infections, medication side effects or, sometimes, unknown causes. Once diagnosed with the disease, most people don't live longer than five years. In a series of studies, the researchers sought to learn about new, and potentially treatable, factors related to early signs of the disease seen on CT scans -- imaging abnormalities that may be present long before symptoms develop -- which may help guide future preventive strategies.

Results of the most recent data analysis, published in the Journal of Nutrition on June 19, suggest that low vitamin D might be one factor involved in developing interstitial lung disease. Although the researchers caution their results can't prove a cause and effect, their data support the need for future studies to investigate whether treatment of vitamin D deficiency, such as with supplements or sunlight exposure, could potentially prevent or slow the progression of the disorder in those at risk. Currently, there is no proven treatment or cure once interstitial lung disease is established.

"We knew that the activated vitamin D hormone has anti-inflammatory properties and helps regulate the immune system, which goes awry in ILD," says Erin Michos, M.D., M.H.S., associate professor of medicine at the Johns Hopkins University School of Medicine and associate director of preventive cardiology at the Johns Hopkins Ciccarone Center for the Prevention of Cardiovascular Disease. "There was also evidence in the literature that vitamin D plays a role in obstructive lung diseases such as asthma and COPD, and we now found that the association exists with this scarring form of lung disease too."

To search for that association, Michos and her research team used data from the Multi-Ethnic Study of Atherosclerosis (MESA), which from 2000 to 2002 recruited 6,814 people from Forsyth County, North Carolina; New York City; Baltimore, Maryland; St. Paul, Minnesota; Chicago, Illinois; and Los Angeles, California. The average age of participants was 62, and 53 percent were women. Thirty-eight percent of participants were white, 28 percent were African-American, 22 percent were Hispanic and 12 percent were Chinese.

At an initial clinical visit, staff took blood samples for each participant and measured, among other things, vitamin D levels. Those with vitamin D levels less than 20 nanograms per milliliter -- about 30 percent of participants -- were considered vitamin D deficient (2,051 people). Those with vitamin D levels of 20-30 nanograms per milliliter were considered to have "intermediate," although not optimal, levels of the nutrient, while those with 30 nanograms per milliliter or more were considered to have met recommended levels.
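The study's three categories, as described above, amount to a simple threshold rule (the function name is illustrative; thresholds in ng/mL are those given in the article):

```python
# Classify a blood vitamin D measurement using the study's cutoffs.
def vitamin_d_category(level_ng_ml):
    if level_ng_ml < 20:
        return "deficient"
    elif level_ng_ml < 30:
        return "intermediate"
    return "recommended"

print(vitamin_d_category(18))  # deficient
print(vitamin_d_category(25))  # intermediate
print(vitamin_d_category(32))  # recommended
```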

All participants underwent heart CT scans at the first visit and some also at later visits, offering incidental and partial views of the lungs.

At 10 years in, 2,668 participants had full lung CT scans evaluated by a radiologist for presence of scar tissue or other abnormalities.

The vitamin D-deficient participants had a larger volume, on average (about 2.7 cubic centimeters), of bright spots in the lung suggestive of damaged lung tissue, compared with those with adequate vitamin D levels. These differences were seen after adjusting for age and lifestyle risk factors for lung disease, including current smoking status, pack-years of smoking, physical inactivity or obesity.

When looking at the data from the full lung scans, the researchers said those with deficient or intermediate vitamin D levels were also 50 to 60 percent more likely to have abnormalities on their full lung scans suggestive of early signs of interstitial lung disease, compared with those with optimal vitamin D levels.

These associations were still seen after additionally adjusting for other cardiovascular and inflammatory risk factors, such as high blood pressure, high cholesterol, diabetes and levels of high-sensitivity C-reactive protein (another inflammatory marker).

"Our study suggests that adequate levels of vitamin D may be important for lung health. We might now consider adding vitamin D deficiency to the list of factors involved in disease processes, along with the known ILD risk factors such as environmental toxins and smoking," says Michos. "However, more research is needed to determine whether optimizing blood vitamin D levels can prevent or slow progression of this lung disease."

People can boost their vitamin D levels by spending 15 minutes a day in summer sunlight or through a diet that includes fatty fish and fortified dairy products. Supplements may be considered for some people with significant deficiency.

According to the 2013 Global Burden of Disease study, about 595,000 people worldwide develop interstitial lung disease each year, and about 491,000 die each year from it.

Credit: 
Johns Hopkins Medicine

Everything big data claims to know about you could be wrong

When it comes to understanding what makes people tick -- and get sick -- medical science has long assumed that the bigger the sample of human subjects, the better. But new research led by the University of California, Berkeley, suggests this big-data approach may be wildly off the mark.

That's largely because emotions, behavior and physiology vary markedly from one person to the next and one moment to the next. So averaging out data collected from a large group of human subjects at a given instant offers only a snapshot, and a fuzzy one at that, researchers said.

The findings, published this week in the Proceedings of the National Academy of Sciences journal, have implications for everything from mining social media data to customizing health therapies, and could change the way researchers and clinicians analyze, diagnose and treat mental and physical disorders.

"If you want to know what individuals feel or how they become sick, you have to conduct research on individuals, not on groups," said study lead author Aaron Fisher, an assistant professor of psychology at UC Berkeley. "Diseases, mental disorders, emotions, and behaviors are expressed within individual people, over time. A snapshot of many people at one moment in time can't capture these phenomena."

Moreover, the consequences of continuing to rely on group data in the medical, social and behavioral sciences include misdiagnoses, prescribing the wrong treatments and generally perpetuating scientific theory and experimentation that is not properly calibrated to the differences between individuals, Fisher said.

That said, a fix is within reach: "People shouldn't necessarily lose faith in medical or social science," he said. "Instead, they should see the potential to conduct scientific studies as a part of routine care. This is how we can truly personalize medicine."

Plus, he noted, "modern technologies allow us to collect many observations per person relatively easily, and modern computing makes the analysis of these data possible in ways that were not possible in the past."

Fisher and fellow researchers at Drexel University in Philadelphia and the University of Groningen in the Netherlands used statistical models to compare data collected on hundreds of people, including healthy individuals and those with disorders ranging from depression and anxiety to post-traumatic stress disorder and panic disorder.

In six separate studies they analyzed data via online and smartphone self-report surveys, as well as electrocardiogram tests to measure heart rates. The results consistently showed that what's true for the group is not necessarily true for the individual.

For example, a group analysis of people with depression found that they worry a great deal. But when the same analysis was applied to each individual in that group, researchers discovered wide variations that ranged from zero worrying to agonizing well above the group average.

Moreover, in looking at the correlation between fear and avoidance - a common association in group research - they found that for many individuals, fear did not cause them to avoid certain activities, or vice versa.
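The statistical point behind these findings can be demonstrated with synthetic data: when people differ in their baselines, pooling everyone can produce a strong group-level correlation even though, within each individual, the two variables are unrelated over time. The "fear" and "avoidance" values below are simulated for illustration, not from the studies.

```python
import random
import statistics

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Two hypothetical people: within each, fear and avoidance fluctuate
# independently over time, but the two people have different baselines.
person_a = [(2 + random.random(), 2 + random.random()) for _ in range(200)]
person_b = [(8 + random.random(), 8 + random.random()) for _ in range(200)]

fear, avoid = zip(*(person_a + person_b))
print(round(corr(fear, avoid), 2))  # pooled: strong positive correlation
a_fear, a_avoid = zip(*person_a)
print(round(corr(a_fear, a_avoid), 2))  # within person A: near zero
```

The pooled correlation reflects only the baseline difference between the two people; it says nothing about how fear and avoidance co-vary within either of them, which is the quantity relevant to individualized treatment.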

"Fisher's findings clearly imply that capturing a person's own processes as they fluctuate over time may get us far closer to individualized treatment," said UC Berkeley psychologist Stephen Hinshaw, an expert in psychopathology and faculty member of the department's clinical science program.

Credit: 
University of California - Berkeley

Scientists isolate protein data from the tiniest of caches -- single human cells

video: Scientists have detected a stockpile of information about proteins within single cells thanks to a new technology developed at Pacific Northwest National Laboratory. The nanoPOTS technology is an automated platform for capturing, shunting, testing and measuring tiny amounts of fluid. Keys to the technology include a robot that dispenses the fluid to a location with an accuracy of one millionth of a meter, moving between tiny wells that minimize the amount of surface area onto which proteins might glom. The team detected and measured more than 650 proteins in samples of fluid less than one-ten-thousandth of a teaspoon.

Image: 
PNNL

Scientists have obtained a slew of key information about proteins, the molecular workhorses of all cells, from single human cells for the first time.

The stockpile of information about proteins - the most such data ever collected from a single mammalian cell - gives scientists one of their clearest looks yet at the molecular happenings inside a human cell. Such data can reveal whether a cell is a rogue cancer cell, a malfunctioning pancreatic cell involved in diabetes, or a molecular player important for a preemie's survival.

These events and many more are determined by the actions of proteins in cells. Until now, detailed information on proteins inside single cells was hard to come by. The raw "data" - the amount of each protein - in a cell is extraordinarily scant and hard to measure. That's largely because scientists can't amplify proteins the way they can genes or other molecular messengers.

Now, in a study published in Angewandte Chemie, scientists from the Department of Energy's Pacific Northwest National Laboratory, working with counterparts at the University of Rochester Medical Center, show how they were able to learn an unprecedented amount of information about the proteins within samples of single human lung cells.

The scientists analyzed single cells, first from cultured cells and then from the lungs of a human donor, and detected on average more than 650 proteins in each cell - many times more than conventional techniques capture from single cells.

The team, including analytical chemists Ying Zhu and Ryan Kelly and biochemists Geremy Clair and Charles Ansong, made the findings thanks to a technology created at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science user facility located at PNNL. The team developed the technology, called nanoPOTS, to measure proteins in a tiny, almost unimaginable amount of material.

"NanoPOTS is like a molecular microscope that allows us to analyze samples that are 500 times smaller than we could see before," said Kelly, the corresponding author of the paper. "We can identify more proteins in one cell than could previously be identified from a group of hundreds of cells."

That's important for a couple of reasons. Some proteins exert immense influence within a cell, perhaps determining whether the cell will live, die, mutate or travel to another part of the body, even when they are at very low levels that are undetectable using today's methods.

In addition, conventional technologies typically analyze hundreds or thousands of cells, pooling them into one batch for analysis. Those findings represent an average view of what's happening in that tissue; there is little insight to what's actually happening in a specific cell. That's a problem if there's variability from cell to cell - if some cells are behaving normally while other cells are cancerous, for instance.

In the current study, the team analyzed the proteins in a sample of fluid that is less than one-ten-thousandth of a teaspoon. Within that sample, the proteins amounted to just 0.15 nanograms - more than ten million times lighter than a typical mosquito.
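A back-of-envelope check of that scale comparison, assuming a typical mosquito weighs roughly 2.5 milligrams (an assumed figure, not from the article):

```python
# Ratio of an assumed mosquito mass to the protein mass in the sample.
sample_protein_ng = 0.15
mosquito_ng = 2.5e6  # 2.5 mg expressed in nanograms
print(mosquito_ng / sample_protein_ng)  # ~1.7e7, i.e. >10 million times
```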

Once scientists have their hands on such a valuable commodity - the innards of a single human cell - they put it through a battery of processing steps to prepare for analysis. But working with such a tiny sample has posed significant roadblocks to single-cell analysis. As the material is transferred from one test tube to another, from machine to machine, some of the sample is lost at every stage. When the original sample amounts to no more than a microscopic droplet, losing even a tiny bit of the sample is catastrophic.

Zhu and Kelly developed nanoPOTS, which stands for nanodroplet Processing in One pot for Trace Samples, to address this problem of sample loss. The technology is an automated platform for capturing, shunting, testing and measuring tiny amounts of fluid. Keys to the technology include a robot that dispenses the fluid to a location with an accuracy of one millionth of a meter, moving between tiny wells that minimize the amount of surface area onto which proteins might glom.

Within those tiny wells, scientists run several steps to isolate the proteins from the rest of the sample. Then, the material is fed into a mass spectrometer, which separates out and measures each of hundreds of proteins.

All told, the technology reduces sample losses by more than 99 percent compared to other technologies, giving scientists enough of the scant material to make meaningful measurements - to tell which proteins are at high levels and which are at low levels. That's vital information when comparing, for example, brain cells from a person with Alzheimer's disease to those from a person not affected, or looking at cells that are cancerous compared to nearby cells that are healthy.

Credit: 
DOE/Pacific Northwest National Laboratory