Culture

UNM study confirms cannabis flower is an effective mid-level analgesic medication for pain

image: UNM researchers found further evidence that cannabis can significantly alleviate pain, with the average user experiencing a three-point drop in reported pain on a 0-10 point scale.

Image: 
Esteban Lopez

Using the largest database of real-time recordings of the effects of common and commercially available cannabis products in the United States (U.S.), researchers at The University of New Mexico (UNM) found strong evidence that cannabis can significantly alleviate pain, with the average user experiencing a three-point drop in reported pain on a 0-10 point scale immediately following cannabis consumption.

With the opioid epidemic at full force and relatively few alternative pain medications available to the general public, the scientists found strong support that cannabis is effective at reducing pain caused by a range of health conditions, with relatively minimal negative side effects.

Chronic pain afflicts more than 20 percent of adults and is the most financially burdensome health condition that the U.S. faces, exceeding, for example, the combined costs of treating heart disease and cancer.

"Our country has been flooded with an over-prescription of opioid medications, which then often leads to non-prescription opioid and heroin use for many people. This man-made disaster is killing our families and friends, regardless of socio-economic status, skin tone, and other superficial human differences," said Jacob Miguel Vigil, one of the lead investigators of the study, titled "The Effectiveness of Self-Directed Medical Cannabis Treatment for Pain," published in the journal Complementary Therapies in Medicine.

Vigil explains, "Cannabis offers the average patient an effective alternative to using opioids for general use in the treatment of pain with very minimal negative side effects for most people."

The researchers relied on information collected with the Releaf App, a mobile software program developed by co-authors Franco Brockelman, Keenan Keeling and Branden Hall. The app enables cannabis users to monitor the real-time effects of the breadth of available cannabis-based products, which are always variable, of course, given the complexity of the Cannabis plant from which these products are obtained.

Since its release in 2016, the commercially developed Releaf App has been the only publicly available, incentive-free app for educating patients on how different types of products (e.g., flower or concentrate), combustion methods, cannabis subspecies (Indica, Sativa, and hybrid), and major cannabinoid contents (THC and CBD) affect their symptom severity levels, providing the user invaluable feedback on their health status, medication choices, and the clinical outcomes of those choices as measured by symptom relief and side effects.

Scientifically, software like the Releaf App enables researchers to overcome the inherent limitations of government-funded clinical trials on the real-time effects of Cannabis, which are rare in general and often limited by onerous federal regulations, including the plant's Schedule I status (no accepted medical use and a high abuse potential) and the mandate that investigators use the notoriously poor quality and low potency cannabis products supplied by the National Institute on Drug Abuse.

"Even rescheduling cannabis from Schedule I to Schedule II, i.e., classifying it with fentanyl, oxycodone, and cocaine rather than heroin and ecstasy, could dramatically improve our ability to conduct research and would only require that the DEA recognize that accepted medical uses for cannabis exist, as clearly evidenced by our results and the flourishing medical cannabis programs in the majority of U.S. states," pointed out co-author Sarah Stith.

Among the study's findings, the greatest analgesic responses were reported by people who used whole dried cannabis flower, or 'buds,' and particularly cannabis with relatively high levels of tetrahydrocannabinol, otherwise known as THC. The more recently popularized cannabinoid, cannabidiol or CBD, in contrast, showed little association with momentary changes in pain intensity, based on the massive database explored in the study.

"Cannabis likely has numerous constituents that possess analgesic properties beyond THC, including terpenes and flavonoids, which likely act synergistically for people who use whole dried cannabis flower," said Vigil. "Our results confirm that cannabis is a relatively safe and effective medication for alleviating pain, and that is the most important message to learn from our results. It can only benefit the public for people to be able to responsibly weigh the true risks and benefits of their pain medication choices, and when given this opportunity, I've seen numerous chronic pain patients substitute away from opioid use, among many other classes of medications, in favor of medical cannabis."

"Perhaps the most surprising result is just how widespread relief was, with symptom relief reported in about 95 percent of cannabis administration sessions and across a wide variety of different types of pain," added lead author of the study, Xiaoxue Li.

The authors do caution that cannabis use carries risks of addiction and short-term impairments in cognitive and behavioral functioning, and may not be effective for everyone. However, there are multiple mechanisms by which cannabis alleviates pain. In addition to its anti-inflammatory properties, cannabis activates receptors that are colocalized with opioid receptors in the brain. "Cannabis with high THC also causes mood elevation and adjusts attentional demands, likely distracting patients from the aversive sensations that people refer to as 'pain,'" explains Vigil.

"When compared to the negative health risks associated with opioid use, which currently takes the lives of over 115 Americans a day, cannabis may be an obvious value to patients. Chronic opioid use is associated with poorer quality of life, social isolation, lower immune functioning and early morbidity. In contrast, my own ongoing research increasingly suggests that cannabis use is associated with a reversal of each of these potential outcomes," said Vigil.

Credit: 
University of New Mexico

Babbling babies' behavior changes parents' speech

ITHACA, N.Y. - New research shows baby babbling changes the way parents speak to their infants, suggesting that infants are shaping their own learning environments.

Researchers from Cornell University's Behavioral Analysis of Beginning Years (B.A.B.Y) Laboratory found that adults unconsciously modify their speech to include fewer unique words, shorter sentences, and more one-word replies when they are responding to a baby's babbling, but not when they are simply speaking to a baby.

"Infants are actually shaping their own learning environments in ways that make learning easier to do," said Steven Elmlinger, lead author of "The Ecology of Prelinguistic Vocal Learning: Parents Simplify the Structure of Their Speech in Response to Babbling." "We know that parents' speech influences how infants learn - that makes sense - and that infants' own motivations also change how they learn. But what hasn't been studied is the link between how infants can change the parents, or just change the learning environment as a whole. That's what we're trying to do."

In the study, 30 mother-infant pairs went to the lab's play space for 30-minute sessions on two consecutive days. The 9- and 10-month-old babies could roam freely around the environment, which was filled with toys, a toy box and animal posters. The babies wore overalls with hidden wireless microphones to record their speech, and were also videotaped by three remote-controlled digital video cameras.

Researchers measured parents' vocabulary and syntax, and calculated the change in babies' vocal maturity from the first to the second day. They found that babies whose mothers provided more learning opportunities - by using simplified speech with fewer unique words and shorter utterances - were faster learners of new speech sounds on the second day.

The research contributes to a growing body of work that demonstrates the important role infants play in shaping their own language learning environment. Interventions to improve at-risk children's learning should encourage people to be responsive to their baby's babbling, said senior author Michael Goldstein, associate professor of psychology.

"It's not meaningless," he said. "Babbling is a social catalyst for babies to get information from the adults around them."

Credit: 
Cornell University

Physicists create world's smallest engine

image: The world's smallest engine works due to its intrinsic spin, which converts heat absorbed from laser beams into oscillations, or vibrations, of the trapped ion.

Image: 
Professor Goold, Trinity College Dublin.

Theoretical physicists at Trinity College Dublin are among an international collaboration that has built the world's smallest engine - which, as a single calcium ion, is approximately ten billion times smaller than a car engine.

Work performed by Professor John Goold's QuSys group in Trinity's School of Physics describes the science behind this tiny motor. The research, published today in the international journal Physical Review Letters, explains how random fluctuations affect the operation of microscopic machines. In the future, such devices could be incorporated into other technologies in order to recycle waste heat and thus improve energy efficiency.

The engine itself - a single calcium ion - is electrically charged, which makes it easy to trap using electric fields. The working substance of the engine is the ion's "intrinsic spin" (its angular momentum). This spin is used to convert heat absorbed from laser beams into oscillations, or vibrations, of the trapped ion.

These vibrations act like a "flywheel", which captures the useful energy generated by the engine. This energy is stored in discrete units called "quanta", as predicted by quantum mechanics.

"The flywheel allows us to actually measure the power output of an atomic-scale motor, resolving single quanta of energy, for the first time," said Dr Mark Mitchison of the QuSys group at Trinity, and one of the article's co-authors.

Starting the flywheel from rest -- or, more precisely, from its "ground state" (the lowest energy in quantum physics) -- the team observed the little engine forcing the flywheel to run faster and faster. Crucially, the state of the ion was accessible in the experiment, allowing the physicists to precisely assess the energy deposition process.

Assistant Professor in Physics at Trinity, John Goold said: "This experiment and theory ushers in a new era for the investigation of the energetics of technologies based on quantum theory, which is a topic at the core of our group's research. Heat management at the nanoscale is one of the fundamental bottlenecks for faster and more efficient computing. Understanding how thermodynamics can be applied in such microscopic settings is of paramount importance for future technologies."

Credit: 
Trinity College Dublin

Improved functional near infrared spectroscopy enables enhanced brain imaging

image: a) High Density Diffuse Optical Tomography system with 158 sources (red) and 166 detectors (cyan); the surface of the brain is shown in pink and blue. b) Histogram of brain surface depth across 24 subjects. c) Brain surface depth probability distribution for each subject. The red stars mark the maximum probability for each subject, while the red line is the most probable depth across all subjects (12.5 mm), highlighting the need to improve both image quality and spatial resolution.

Image: 
The authors.

BELLINGHAM, Washington, USA and CARDIFF, UK - In an article published today in the peer-reviewed, open-access SPIE publication Neurophotonics, "High density functional diffuse optical tomography based on frequency domain measurements improves image quality and spatial resolution," researchers demonstrate critical improvements to functional Near Infrared Spectroscopy (fNIRS)-based optical imaging in the brain.

fNIRS-based optical imaging is a non-invasive and relatively inexpensive technology used where neuroimaging is required, applied in areas such as functional brain mapping, psychology studies, intensive care unit patient monitoring, mental disease monitoring, and early dementia diagnosis. By applying amplitude-modulated light, known as frequency domain (FD) light, rather than the usual continuous wave (CW) NIR light, to obtain overlapping measurements of tissue via high-density diffuse optical tomography, the researchers were able to achieve higher resolution in their imaging while also enabling sensitivity to deeper brain regions. FD-NIRS has already been used in areas such as breast-lesion optical tomography, brain-trauma assessment and joint imaging, but this is the first example of researchers comparing the FD-NIRS approach against CW-NIRS in functional brain imaging.

According to Neurophotonics Associate Editor Rickson Mesquita, of the University of Campinas' Institute of Physics in São Paulo, Brazil, the findings mark exciting new possibilities in the fNIRS arena: "I believe this manuscript can be significant from the methods perspective since it addresses important validation about DOT with frequency-domain (FD) data. Importantly, their simulations and data appear to show that, by adding the phase information from the FD data, the depth sensitivity is greatly improved. The results are carefully addressed, and the authors' conclusions are of great interest to the fNIRS community."

The article authors are Matthaios Doulgerakis and Hamid Dehghani of the University of Birmingham's School of Computer Science in Birmingham, UK, and Adam T. Eggebrecht of the Mallinckrodt Institute of Radiology at the Washington University School of Medicine in St. Louis, Missouri, USA.

Neurophotonics, an open-access journal, is published in print and digitally by SPIE in the SPIE Digital Library. Its editor-in-chief is Boston University Neurophotonics Center Director David Boas. The SPIE Digital Library contains more than 500,000 publications from SPIE journals, proceedings, and books, with approximately 18,000 new research papers added each year.

About SPIE

SPIE is the international society for optics and photonics, an educational not-for-profit organization founded in 1955 to advance light-based science, engineering, and technology. The Society serves 257,000 constituents from 173 countries, offering conferences and their published proceedings, continuing education, books, journals, and the SPIE Digital Library. In 2018, SPIE provided more than $4 million in community support including scholarships and awards, outreach and advocacy programs, travel grants, public policy, and educational resources. www.spie.org.

Contact: Daneet Steffens, Public Relations Manager, daneets@spie.org, +1 360 685 5478, @SPIEtweets

Journal

Neurophotonics

DOI

10.1117/1.NPh.6.3.035007

Credit: 
SPIE--International Society for Optics and Photonics

Foodborne pathogen sheltered by harmless bacteria that support biofilm formation

image: The research was done in collaboration with the apple industry, in an effort to better understand the microbial ecology of food-processing facilities. The ultimate goal is to identify ways to improve pathogen control in the apple supply chain to avoid foodborne disease outbreaks and recalls of apples and apple products.

Image: 
Brian Holsclaw

Pathogenic bacteria that stubbornly lurk in some apple-packing facilities may be sheltered and protected by harmless bacteria that are known for their ability to form biofilms, according to Penn State researchers, who suggest the discovery could lead to development of alternative foodborne-pathogen-control strategies.

That was the key finding that emerged from a study of three tree-fruit-packing facilities in the Northeast where contamination with Listeria monocytogenes was a concern. The research, done in collaboration with the apple industry, was an effort to better understand the microbial ecology of food-processing facilities. The ultimate goal is to identify ways to improve pathogen control in the apple supply chain to avoid foodborne disease outbreaks and recalls of apples and apple products.

"This work is part of Penn State's efforts to help producers comply with standards set forth in the federal Food Safety Modernization Act, often referred to as FSMA," said researcher Jasna Kovac, assistant professor of food science, College of Agricultural Sciences. "The Department of Food Science at Penn State, through research and extension activities, has an ongoing collaboration with the apple industry, led by Luke LaBorde, professor of food science."

In the study, researchers sought to understand the composition of microbiota in apple-packing environments and its association with the occurrence of the foodborne pathogen Listeria monocytogenes. Their testing revealed that a packing plant with a significantly higher Listeria monocytogenes occurrence was uniquely dominated by the bacterial family Pseudomonadaceae and the fungal family Dipodascaceae.

"As we investigated the properties of these microorganisms, we learned that they are known to be very good biofilm formers," said lead researcher Xiaoqing Tan, a recently graduated master's degree student in food science and a member of the Penn State Microbiome Center, housed in the Huck Institutes of the Life Sciences. "Based on our findings, we hypothesize that these harmless microorganisms are supporting the persistence of Listeria monocytogenes because they protect the harmful bacteria by enclosing them in biofilms. We are testing this hypothesis in a follow-up study."

Biofilms are a collection of microorganisms that attach to a surface and then secrete a slimy material that slows down the penetration of cleaners and sanitizers, Kovac explained. "If a pathogenic bacterium is enclosed in a biofilm formed by microbiota, it is more likely that cleaning and sanitizing procedures will be less effective," she said. "This is a novel perspective, and it may well explain how Listeria monocytogenes has persisted in food-processing plants despite repeated efforts to kill and remove it."

The findings of the research, published today (Aug. 21) in Microbiome, provide insight into the Listeria contamination problem and may lead to researchers and the apple industry getting closer to solving it, Kovac believes. Equipment in fruit-processing plants -- such as brush conveyors -- has a poor sanitary design that makes it difficult to clean and sanitize, she pointed out. She and LaBorde plan to work with the apple industry to devise more effective cleaning and sanitizing strategies.

"Following up on these findings, we are experimenting with some of the nonpathogenic strains of bacteria that are not harmful to humans to see whether they can be used as biocontrols," she said. "Once applied on the surfaces of the equipment in these environments, they may be able to outcompete and suppress Listeria, thus reducing food-safety risks and potential regulatory action. We are still exploring that approach in a controlled laboratory environment. If it proves to be feasible, we would like to test it in apple-packing and processing facilities."

The challenge presented by microbiota possibly sheltering Listeria monocytogenes is not limited to fruit-processing facilities or produce, Penn State researchers suspect. They will soon begin analyzing microbial communities in dairy-processing facilities to determine the microbial composition and ecology of these environments.

Credit: 
Penn State

Texas cities increasingly susceptible to large measles outbreaks

video: This simulation, created by the University of Pittsburgh Graduate School of Public Health, shows how a measles outbreak could spread in Austin, Texas, if vaccination rates drop by 10% and a single measles case is introduced.

Image: 
University of Pittsburgh Public Health Dynamics Laboratory FRED Measles

PITTSBURGH, Aug. 21, 2019 - The growing number of children arriving at Texas schools unvaccinated makes the state increasingly vulnerable to measles outbreaks in cities large and small, according to a computer simulation created by the University of Pittsburgh Graduate School of Public Health.

The findings, published today in the journal JAMA Network Open, indicate that an additional 5% decrease in vaccination rates, which have been on a downward trend since 2003, would increase the size of a potential measles outbreak by up to 4,000% in some communities.

"At current vaccination rates, there's a significant chance of an outbreak involving more than 400 people right now in some Texas cities," said lead author David Sinclair, Ph.D., a postdoctoral researcher in Pitt's Public Health Dynamics Laboratory. "We forecast that a continuous reduction in vaccination rates would exponentially increase possible outbreak sizes."

Measles is a highly contagious virus that can cause severe complications, including pneumonia, brain swelling and deafness. Approximately 1 out of every 1,000 children infected with measles will die from respiratory and neurologic complications. Measles is so contagious that, if no one was immunized, one infected person is likely to infect 12 to 16 others; in comparison, one person infected with the flu is expected to infect only one to two people. The measles vaccine -- which is often combined with the mumps and rubella vaccines and called the "MMR vaccine" -- is highly effective, conveying 97% immunity after two doses.
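The contagiousness figures above map onto a standard epidemiological rule of thumb: an outbreak cannot sustain itself once more than 1 - 1/R0 of contacts are immune, where R0 is the number of people one case infects in a fully susceptible population. This back-of-the-envelope calculation is ours, not the study's, but it shows why coverage in the low 90s is the critical zone for measles:

```python
# Herd immunity threshold implied by the basic reproduction number R0:
# transmission dies out once more than 1 - 1/R0 of contacts are immune.
def herd_immunity_threshold(r0: float) -> float:
    return 1.0 - 1.0 / r0

for disease, r0 in [("measles (low estimate)", 12),
                    ("measles (high estimate)", 16),
                    ("seasonal flu", 1.5)]:
    print(f"{disease}: R0 = {r0} -> threshold {herd_immunity_threshold(r0):.0%}")
```

For R0 between 12 and 16 the threshold lands at roughly 92-94 percent, which is consistent with the simulation's finding that schools below about 92 percent coverage are low enough for measles to sustain transmission.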

Sinclair and his team loaded real-world vaccination data for private schools and public school districts in Texas into the Framework for Reconstructing Epidemiological Dynamics (FRED) tool. FRED is an "agent-based" modeling system, which means it creates a synthetic population using U.S. Census data and then assigns the synthetic people to move about their communities from home to work or school as people do in the real world. This tool allows users to see, in silico, how a contagion could spread from person to person. In 2015, California legislators used FRED to help convince their peers to pass a bill restricting vaccine exemptions for school-age children.
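As a rough illustration of what an agent-based contagion model does, here is a deliberately minimal sketch. Every parameter (population size, contact count, transmission probability) is an invented illustrative value, and real FRED populations are built from census data with households, schools and workplaces rather than the uniform random mixing used here:

```python
import random

# Toy agent-based contagion model: introduce one case into a mostly
# vaccinated synthetic population and count how far it spreads.
random.seed(1)

N = 1000           # synthetic population size (illustrative)
VACC_RATE = 0.92   # fraction immune at the start (illustrative)
P_TRANSMIT = 0.9   # per-contact transmission probability (illustrative)
CONTACTS = 15      # contacts per infectious agent per day (illustrative)

# Agent states: "S" susceptible, "I" infectious, "R" recovered/immune.
state = ["R" if random.random() < VACC_RATE else "S" for _ in range(N)]
state[state.index("S")] = "I"   # a single case in an unvaccinated agent

total_infected = 1
for day in range(270):          # one school year, as in the simulation
    infectious = [i for i, s in enumerate(state) if s == "I"]
    if not infectious:
        break                   # outbreak has burned out
    for i in infectious:
        for _ in range(CONTACTS):
            j = random.randrange(N)
            if state[j] == "S" and random.random() < P_TRANSMIT:
                state[j] = "I"
                total_infected += 1
        state[i] = "R"          # infectious for one day in this toy model

print("outbreak size:", total_infected)
```

Even this crude model reproduces the qualitative behavior the researchers describe: with coverage well above the herd immunity threshold the introduced case fizzles out, while lowering VACC_RATE by a few percentage points lets outbreak sizes grow sharply.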

The Texas Pediatric Society asked Pitt Public Health to model Texas in FRED to demonstrate the possibility of outbreaks in communities with low vaccination rates. Texas is the largest state by population that allows parents to opt their children out of required vaccines for religious or personal reasons. These exemptions have increased 28-fold in Texas, from 2,300 in 2013 to 64,000 in 2016. Austin is the current home of Andrew Wakefield, a discredited former British doctor who published falsified research linking vaccines to autism and who continues to espouse anti-vaccine messages.

In the FRED Measles Texas simulation, a single case of measles is introduced into various metropolitan areas through a randomly selected student whose parents have refused to vaccinate. The simulation runs for each city for 270 days -- the length of the typical school year -- at current vaccination rates and at a hypothetical decrease in those rates.

At current rates, the simulation estimates that measles outbreaks of more than 400 cases could occur in Austin and Dallas-Fort Worth. This is partly due to a minority of schools where vaccination rates are less than 92% -- low enough for measles to sustain transmission.

If the vaccination rate drops 5% in only the schools with populations that currently are undervaccinated, the size of potential measles outbreaks climbs exponentially in every metropolitan area, with Dallas-Fort Worth, Austin and Houston all susceptible to outbreaks of 500 to 1,000 people.

Approximately 64% of the simulated cases occur in children who were unvaccinated because they had a religious or personal exemption. But the model forecasts that 36% of the cases would be in people who have a medical condition that prohibits vaccination, whose vaccine failed to build immunity, or in unvaccinated adults, for whom the risk of complications is higher than for children.

"When someone refuses to be vaccinated, they are making a decision that doesn't only impact them. They are increasing the risk that people who are not immune, through no fault of their own, will get very sick and possibly die," said senior author Mark Roberts, M.D., M.P.P., professor and chair of Pitt Public Health's Department of Health Policy and Management, and director of Pitt's Public Health Dynamics Laboratory.

Credit: 
University of Pittsburgh

Is it autism? The line is getting increasingly blurry

Around the world, the number of people diagnosed with autism is rising. In the United States, the prevalence of the disorder has grown from 0.05% in 1966 to more than 2% today. In Quebec, the reported prevalence is close to 2% and according to a paper issued by the province's public health department, the prevalence in Montérégie has increased by 24% annually since 2000.

However, Dr. Laurent Mottron, a professor at Université de Montréal's Department of Psychiatry and a psychiatrist at the Hôpital en santé mentale de Rivière-des-Prairies of the CIUSSS du Nord-de-l'Île-de-Montréal, has serious reservations about this data.

After studying meta-analyses of autism data, his research team found that the difference between people diagnosed with autism and the rest of the population is actually shrinking.

This study is published today [August 21] in JAMA Psychiatry, the most prestigious journal in the field of psychiatry. Given the importance of its findings, the study is also the subject of the journal's editorial.

Less differentiation observed

Dr. Mottron worked with intern Eya-Mist Rødgaard of the University of Copenhagen and four other researchers from France, Denmark and Montreal to review 11 meta-analyses published between 1966 and 2019, with data drawn from nearly 23,000 people with autism.

The meta-analyses showed that people with autism and people in the rest of the population exhibit significant differences in seven areas: emotion recognition, theory of mind (ability to understand that other people have their own intentions), cognitive flexibility (ability to transition from one task to another), activity planning, inhibition, evoked responses (the nervous system's response to sensory stimulation) and brain volume. Together, these measurements cover the basic psychological and neurological components of autism.

Dr. Mottron and his team looked at the "effect size" -- the size of the differences observed between people with autism and people without it -- and compared its progression over the years. This measurement is a statistical tool that quantifies the size of difference in a specific characteristic between two groups of subjects.
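One common way to compute an effect size of this kind is Cohen's d: the difference between two group means divided by their pooled standard deviation. The sketch below uses invented scores purely to illustrate the arithmetic; these are not data from the meta-analyses:

```python
import statistics

# Cohen's d: standardized difference between two group means,
# using the pooled (sample) standard deviation.
def cohens_d(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical scores on some measure, e.g. an emotion-recognition task:
control_scores = [7, 8, 7, 9, 8, 7]
autism_scores = [4, 5, 6, 5, 4, 6]
print(round(cohens_d(control_scores, autism_scores), 2))
```

A shrinking effect size over successive studies, as the team reports, means the two groups' score distributions increasingly overlap, even if the group means remain statistically distinguishable.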

They found that, in each of the assessed areas, the measurable difference between people with autism and people without it has decreased over the past 50 years. In fact, a statistically significant dilution in effect size (ranging from 45% to 80%) was noted in five of these seven areas. The only two measurements that didn't show significant dilution were inhibition and cognitive flexibility.

"This means that, across all disciplines, the people with or without autism who are being included in studies are increasingly similar," said Mottron. "If this trend holds, the objective difference between people with autism and the general population will disappear in less than 10 years. The definition of autism may get too blurry to be meaningful -- trivializing the condition -- because we are increasingly applying the diagnosis to people whose differences from the general population are less pronounced."

To verify that the trend was unique to autism, the research team also analyzed data on similar areas from studies on schizophrenia. They found that the prevalence of schizophrenia has stayed the same and the difference between people with schizophrenia and those without it is increasing.

Changes in diagnostic practices

The diagnostic criteria for autism haven't changed over the years that the differences have diminished. Instead, Dr. Mottron believes that what has changed are diagnostic practices.

"Three of the criteria for an autism diagnosis are related to sociability," he said. "Fifty years ago, one sign of autism was a lack of apparent interest in others. Nowadays, it's simply having fewer friends than others. Interest in others can be measured in various ways, such as making eye contact. But shyness, not autism, can prevent some people from looking at others." 

To complicate matters, the term "autism" has fallen out of favour, replaced by "autism spectrum disorder," a sign of a new belief that various different forms of the condition exist. This has prompted some people to question whether autism exists at all.

"And yet, autism is a distinct condition," says Dr. Mottron. "Our study shows that changes in diagnostic practices, which have led to a false increase in prevalence, are what's fuelling theories that autism doesn't really exist." 

Even though Dr. Mottron recognizes that there is a continuum between people with autism and those without it, he believes that such a continuum could result from the juxtaposition of natural categories. "Autism is a natural category at one end of the socialization continuum. And we need to focus on this extreme if we want to make progress," he said.

In his opinion, autism studies include too many participants who aren't sufficiently different from people without autism. In contrast to the generally prevailing scientific belief, Dr. Mottron thinks that including more subjects in studies on autism, as it is currently defined, reduces the likelihood of discovering new things about the mechanisms of the disorder. No major discoveries have been made in this field in the last 10 years.

Credit: 
University of Montreal

Germany and the United Kingdom are popular destinations

image: EU Migration Networks 2013: migration pathways between countries in the EU.

Image: 
M Windzio

Those who wish to leave their own country of origin in the European Union (EU) can currently do so without complications: with the right to freedom of movement, the EU offers its citizens unique conditions for migration. A study by the Universities of Göttingen, Bremen and Cologne has now shown that Germany and the United Kingdom are the most popular destination countries for migration within Europe. Their research was published in the Journal of Ethnic and Migration Studies.

With sociologists from Bremen (Professor Michael Windzio) and Cologne (Sven Lenkewitz), Professor Céline Teney, Professor of Fundamentals of Social Sciences at the Institute of Sociology at the University of Göttingen, analysed the factors for internal migration in the European Union. Together they studied how the figures for immigration and emigration developed across the member states from 2001 to 2013. The network analysis showed that the richer countries in particular are attractive for European immigrants. In addition, the researchers found that there is a great deal of movement between neighbouring countries. New regulatory policies within the EU can also influence the flow of people: the opening of the labour market for citizens from the new member states has led to higher rates of immigration; while the EU accession of the newest member states has been accompanied by more emigration from these areas.
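The network approach described above can be pictured as a weighted directed graph: countries are nodes, migration flows are weighted edges, and a destination's "popularity" is its weighted in-degree. The flow counts below are invented purely for illustration, not taken from the study:

```python
# Toy migration network: (origin, destination) -> number of migrants (made up).
flows = {
    ("PL", "DE"): 120, ("PL", "UK"): 150, ("RO", "DE"): 90,
    ("RO", "UK"): 70,  ("FR", "DE"): 30,  ("DE", "UK"): 25,
}

# Weighted in-degree: total inflow per destination country.
inflow = {}
for (origin, dest), count in flows.items():
    inflow[dest] = inflow.get(dest, 0) + count

# Rank destinations by total inflow, most popular first.
print(sorted(inflow.items(), key=lambda kv: -kv[1]))
```

The same in-degree ranking, computed on real Eurostat-style flow data for all member states, is what identifies the richer countries, and Germany and the UK in particular, as the network's main destinations.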

It remains to be seen how Brexit will affect the status of the United Kingdom and Germany as the most popular target countries. The study suggests that the UK will not easily lose its status as the most popular destination for European internal migration. "Our findings underline the push and pull factors for immigration and emigration within European member states. It has been shown that regulatory changes within the EU play a relatively secondary role in affecting migration," says Teney.

Credit: 
University of Göttingen

What factors influence how antibiotics are accessed and used in less well-off countries

It is often assumed that people use antibiotics inappropriately because they don't understand enough about the spread of drug-resistant superbugs.

A new study led by Warwick University Assistant Professor Marco J Haenssgen challenges this view. The study, published in the medical journal BMJ Open, reveals that basic understanding of drug resistance is in fact widespread in Southeast Asia but that higher levels of awareness are actually linked to higher antibiotic use in the general population.

The researchers conducted a large-scale survey among a representative sample of the rural population of 69 villages in northern Thailand and 65 villages in southern Lao PDR.

The survey found that:

people's awareness of drug resistance was similar to that of many industrialised countries - three in four villagers in Thailand and six in ten in Laos had heard about "drug resistance," although the term was usually interpreted as a change in the human body rather than as the evolution of bacteria to withstand antibiotic medicine.

people's attitudes in rural Thailand and Laos were often consistent with recommendations from the World Health Organization to not buy antibiotics without prescription. However, such attitudes were linked to disproportionately and potentially problematically high rates of prescribed antibiotics from public clinics and hospitals - up to 0.5 additional antibiotic courses per illness on average when controlling for other drivers of antibiotic use.

people who obtained antibiotics from informal sources, such as the village shop, were just as aware of drug resistance as people who relied on public healthcare channels.

patients receiving antibiotics from informal sources were no less wealthy or formally educated than users of public healthcare. Indeed, in Chiang Rai, wealthier and more educated individuals were significantly more likely to receive antibiotics from informal sources, showing that it is not just people on low incomes who obtain them this way.

Project leader Asst Prof Marco J Haenssgen interprets these results as a sign that the conventional public health model of behaviour change is failing: "Too many arguments in public health behaviour change rest on a model of 'information deficits.' This idea that people behave irrationally because they don't have the right information finds little support in our research."

"Basic awareness about drug resistance and antibiotics is widespread but does not contribute to better behaviour. New information can be empowering in principle, but people themselves decide how they will use this new 'power' in their daily lives. Unnecessary antibiotic use may then rather reflect privilege, resistance to patronising norms, or interference between local and Western ideas of what good care ought to be."

Thailand and Laos were selected for this study because of their traditionally high rates of antibiotic use and busy international travel patterns, which predispose these countries to the development and spread of drug resistance. The survey involved 2,141 adults from more than 130 villages who represent a rural population of 712,000 villagers in Thailand and Laos. Dr Haenssgen argues that the findings have a wider relevance, however.

"Ours is not an isolated case. Colleagues in China found for instance that more educated people were more likely to buy non-prescription medicine from unregistered stores, and the behavioural sciences have long established that information alone only accounts for a fraction of healthcare decisions. Public health has to catch up! To tackle the superbug crisis, we need to shift our attention to human decision-making processes and to people's behavioural responses to local contexts."

Credit: 
University of Warwick

New efficient method for urine analysis may tell us more

image: Using gadolinium (a contrast agent used in MRI scans) may revolutionize the application of Nuclear Magnetic Resonance spectroscopy as a tool for more comprehensive and useful analysis of urine samples (Colourbox).

Image: 
Colourbox

Human urine contains hundreds of small molecules that tell us about our health, diet and well-being. Associate Professor Frans Mulder, in collaboration with the University of Florence, has been successful in developing a new method for analyzing the components of a urine sample. Using this method, the analysis becomes both cheaper and more accurate.

The new approach is easy for other laboratories to adopt, and an important step towards broader analysis of public health as well as personalized medicine. The research will be published in the acclaimed journal Angewandte Chemie, which has rated the results as "very important".

Old technique with new ingredient

The method uses Nuclear Magnetic Resonance (NMR) spectroscopy, a well-established and invaluable analytical tool for identifying and quantifying small-molecule compounds. Among other things, the presence and levels of such compounds, known as metabolites, in human blood and urine can tell you about your health. However, the way NMR data is currently recorded is rather slow and, in some cases, also unsuitable for obtaining correct quantities.

Other researchers have previously investigated whether there are chemical compounds (adjuvants) that could speed up the analysis. Until now, however, no adjuvant had been found that does not interfere with the signal and thus distort the measured metabolite quantities.

Frans Mulder is the first to have identified a suitable adjuvant that can restore the innate quantitative nature of the NMR technique precisely, and at the same time make it possible to shorten the time it takes to record the data.
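The time savings at stake can be illustrated with a standard rule of thumb from quantitative NMR: the recycle delay between scans is often set to about five times the relaxation time T1, so an agent that shortens T1 shortens the whole experiment. The numbers below are hypothetical, not values from the study.

```python
import math

def experiment_time(t1_seconds, n_scans, acquisition=1.0):
    """Total duration for n_scans with a 5 * T1 recycle delay.

    A 5 * T1 delay lets the magnetisation recover to
    1 - exp(-5), about 99.3%, the usual threshold for
    quantitative measurements.
    """
    return n_scans * (acquisition + 5 * t1_seconds)

# Hypothetical case: a paramagnetic agent (e.g. a gadolinium complex)
# shortens T1 from 4 s to 0.4 s.
without_agent = experiment_time(4.0, n_scans=128)  # 2688 s
with_agent = experiment_time(0.4, n_scans=128)     # 384 s
print(without_agent / with_agent)                  # 7x speed-up
```

Under these assumed numbers, a tenfold reduction in T1 translates into a sevenfold shorter experiment, which is the kind of gain the adjuvant approach targets.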


Contrast agent as an adjuvant

In his search for this adjuvant, Frans Mulder searched for so-called paramagnetic molecules. The literature led him to complexes of the element gadolinium (Gd), which is a constituent of a contrast agent used in hospitals for MRI scans. In the new research, the contrast agent is added to the urine sample and not the patient to be examined.

"We measured the amounts of small metabolites in both simple fluids and in urine samples from healthy subjects. The use of these so-called 'contrast agents' in the analysis of the urine samples resulted in the measurements being quantifiable. Since the new approach is both versatile and inexpensive, researchers in other laboratories can easily apply it, thus leading to more efficient analyses of urine samples in both population epidemiology and personalized medicine," says Associate Professor Frans Mulder.

Frans Mulder's research was made possible through a sabbatical stay at one of the best European NMR research infrastructures. With endorsement from the department and financial support from the Aarhus University Research Fund, it became possible to go in a new research direction and achieve an excellent result.

Credit: 
Aarhus University

Music charts are increasingly short-lived

Since the 1960s, music charts have been compiled using the same criterion - sales figures. Charts therefore provide sets of comparable data spanning many decades, a circumstance that makes them particularly well-suited for investigating the long-term development of cultural time scales. This approach is also relevant beyond the cultural domain, in particular with regard to the pace of political opinion formation, which affects the dynamical stability of liberal democracies.

In a new article published today in Royal Society Open Science, Lukas Schneider and Professor Claudius Gros from the Institute for Theoretical Physics at Goethe University demonstrate that the statistical characteristics, the composition, and the dynamics of the US, UK, Dutch and German pop album charts have changed significantly since the beginning of the 1990s. On the one hand, the diversity of the charts has doubled, or even tripled: Now there are significantly more albums making it to the top 100 or top 40 in a year. On the other hand, we now see that an album either starts off immediately as number one - or never reaches the top. In contrast, from the 1960s to the 1980s, successful albums needed four to six weeks to work their way from their starting position to the top slot.

An album's "lifetime" - the number of weeks it stays listed - was found to have changed profoundly. Before the 1990s, the statistics of album lifetimes were governed by a Gauss distribution with a logarithmic argument (log-normal). Today, the distribution of album lifetimes is instead characterised by a power law. The distribution of lifetimes is therefore universal, i.e. independent of the specifics of the process, a key characteristic of the final state of a self-organising process.

To explain this development, the authors propose an information-theoretical approach to human activities. Schneider and Gros assume that humans strive continuously to optimise the information content of their experiences and perceptions. Mathematically, information is captured by the Shannon entropy. According to Schneider and Gros, one must furthermore consider the Weber-Fechner law, which states that time and other variables are represented and stored in the brain not in a one-to-one ratio, but in a greatly compressed way (on a logarithmic scale). In this view, optimising the information content of compressed experiences explains the observed chart statistics.
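The difference between the two regimes can be made concrete with synthetic data: a log-normal concentrates lifetimes around a typical value, while a power law has a heavy tail in which a few extreme lifetimes dominate. The parameters below are invented for illustration and are not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Pre-1990s regime: log-normally distributed lifetimes (weeks on chart).
lognormal_lifetimes = rng.lognormal(mean=2.0, sigma=0.7, size=n)

# Modern regime: power-law lifetimes p(t) ~ t**(-alpha) for t >= 1,
# sampled by inverting the CDF F(t) = 1 - t**(-(alpha - 1)).
alpha = 2.0
powerlaw_lifetimes = (1 - rng.random(n)) ** (-1 / (alpha - 1))

# The power law's tail is far heavier: its maximum dwarfs its median,
# whereas the log-normal sample stays comparatively concentrated.
for sample in (lognormal_lifetimes, powerlaw_lifetimes):
    print(np.median(sample), sample.max())
```

The heavy tail is what makes the modern charts look "all or nothing": most albums vanish quickly while a handful persist for an extremely long time.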

Overall, the study by Schneider and Gros shows that chart dynamics has accelerated, proceeding today substantially faster than several decades ago. The authors conjecture that a similar acceleration could also be at work for the underlying socio-cultural processes, such as social and political opinion formation. As an earlier work by Gros shows, this could threaten the dynamical stability of modern democracies, as the functioning of democracies is based on reliable temporal ties between electorate and political institutions. These temporal ties are threatened when the time scale of political opinion formation and that of the delayed decision processes drift increasingly apart (Claudius Gros, Entrenched time delays versus accelerating opinion dynamics: Are advanced democracies inherently unstable? European Physical Journal B 90, 223 (2017)).

Credit: 
Goethe University Frankfurt

Protein aggregation: Protein assemblies relevant not only for neurodegenerative disease

image: Cross section of the 3D model of an amyloid fibril against the backdrop of a cryo-electron microscopy recording. A PI3K SH3 domain is highlighted in yellow.

Image: 
FZJ / Christine Roeder

Proteins are central components of living material. These complex molecules, built from chains of individual amino acids, in some cases comprise thousands of atoms and adopt sophisticated three-dimensional shapes. The term 'fold' is used to describe this structure. The fold of a protein determines its biological function.

Misfolding into non-natural structures and associated aggregation makes proteins not only useless but potentially toxic. The current view is that many neurodegenerative diseases are triggered by misfolded proteins. They form deposits in critical parts of the central nervous system. Initially, fibrillar structures, referred to as 'amyloid fibrils', form. Larger deposits of such amyloid fibrils form the typical plaques that can be found in the brain tissue and can restrict, damage or kill nerve cells.

The PI3K SH3 domains are usually part of larger proteins, but can also exist alone in their correctly folded form. They play a major role in cellular communication. For many years, these domains have been used as model systems in order to examine protein folding and thus determine the causes of misfolding. This is because researchers have discovered that these domains can also form amyloid fibrils that do not differ from the fibrils typical for diseases and are just as poisonous to cells. In fact, all proteins can potentially form amyloid fibrils; healthy organisms must actively and constantly combat this process.

Many fundamental discoveries of amyloid fibrils that are directly applicable to disease-related proteins were made using this model system. "But what we didn't know until now was the precise three-dimensional structure of the fibrils from the PI3K SH3 domains," explains Prof. Dr. Gunnar Schröder, Professor of Computational Structural Biology at HHU as well as work group leader at Forschungszentrum Jülich.

"Now we can use cryo-EM to understand these structures fully," adds Prof. Dr. Alexander Büll, corresponding author alongside Schröder of the study published in Nature Communications. Büll was an Assistant Professor at HHU until early 2019 and is now a Full Professor in the Department of Biotechnology and Biomedicine at the Technical University of Denmark in Lyngby. Speaking about the significance of this determined structure, Prof. Schröder adds: "Now that we know the spatial structure, much of the earlier data from the last 20 years can be reinterpreted or interpreted more quantitatively."

"Cryo-electron microscopy is a wonderful tool for determining the three-dimensional structure of the fibrils," emphasises Christine Röder, first author of the study and a member of Prof. Schröder's work group in Jülich. The 2017 Nobel Prize in Chemistry was awarded for the development of this method, which allows complex biomolecules that adopt their natural form only in an aqueous environment to be imaged at atomic resolution. The samples dissolved in water - for example proteins - are plunge-frozen at very low temperatures and thus fixed in their natural structure. This makes it possible to examine them under an electron microscope in this state. A single image is not enough, however; instead, a succession of recordings shows the protein from different angles. Computers then combine the many individual recordings into a three-dimensional image.

Credit: 
Heinrich-Heine University Duesseldorf

Environmental DNA proves the expansion of invasive crayfish habitats

image: Non-native signal crayfish (left) were found upstream of a 69 cm-high road-crossing culvert (right).

Image: 
Kousuke I. et al., Freshwater Science. July 24, 2019

Environmental DNA (eDNA) analysis has confirmed the presence of invasive crayfish in almost all the small streams around Lake Akan in Japan, suggesting that it is an efficient and highly sensitive method for assessing the distribution of aquatic organisms.

Researchers from Hokkaido University have found that signal crayfish, Pacifastacus leniusculus, may have endangered the habitat of Japanese crayfish, Cambaroides japonicus, through a survey method utilizing eDNA. The results suggest that signal crayfish are more widely distributed than Japanese crayfish in the streams around Lake Akan.

Freshwater ecosystems face a multitude of threats to their habitats. Freshwater animals account for more than a quarter of all endangered species, roughly 4,600 species worldwide. Biological invasions are a major driver of biodiversity loss in many ecosystems, including freshwater ones. eDNA is genetic material extracted directly from environmental samples such as soil, water, air or traces left behind by organisms. eDNA analysis is becoming a practical and cost-effective method for surveying aquatic species, and efforts are being made to improve it and clarify its limitations. In previous studies, the distributions of the signal crayfish and Japanese crayfish had only been assessed by capturing specimens by hand, without applying eDNA methods.

This study, led by Junjiro Negishi and his colleagues from Pacific Consultants Co., Ltd. and the University of Hyogo, investigated current distribution patterns of Japanese crayfish and signal crayfish in the streams around Lake Akan by both conventional hand capturing method and eDNA analysis. The results have been published in Freshwater Science.

Species-specific eDNA primers were first developed to detect and distinguish the two species. Seventeen streams were then surveyed by both methods. The surveys showed that signal crayfish have now invaded 16 of the 17 streams, at 19 of the 22 survey sites (86%).

"Unexpectedly, signal crayfish DNA was detected at almost all the sites, even upstream of a fast current, steep slope, and a road-crossing culvert with a great drop-off height, suggesting the expansion of its distribution range in the past decades," says Junjiro Negishi. "eDNA of one or both species was found at some sites where no individuals were caught by hand, showing that eDNA can identify the presence or absence of aquatic organisms more reliably than conventional survey methods."
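Comparing the two survey methods site by site is essentially a set operation: find the sites that were eDNA-positive but yielded no hand-caught individuals. The site identifiers below are hypothetical, purely to show the shape of the comparison.

```python
# Hypothetical detection results per survey site (not the study's data).
edna_positive = {"S01", "S02", "S03", "S04", "S05", "S06"}
hand_capture_positive = {"S01", "S02", "S04"}

# Sites where eDNA detected crayfish that hand capture missed.
edna_only = edna_positive - hand_capture_positive
# Sites where hand capture succeeded but eDNA did not (ideally empty).
hand_only = hand_capture_positive - edna_positive

print(sorted(edna_only))  # ['S03', 'S05', 'S06']
print(sorted(hand_only))  # []
```

A non-empty `edna_only` set is exactly the pattern the researchers report: eDNA flags presence at sites where conventional capture finds nothing.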

Preventing the invasion of non-native species is generally less costly than post-entry control, and eDNA analysis could be advantageous here because of its potential for rapid and accurate monitoring. "In our study, culverts had limited effectiveness against signal crayfish invasion. So, active species-specific removal projects may be an effective option to minimize the negative impacts of signal crayfish where they are already present," the researchers concluded.

Credit: 
Hokkaido University

Color-changing artificial 'chameleon skin' powered by nanomachines

video: These are artificial chromatophores switching color.

Image: 
University of Cambridge

Researchers have developed artificial 'chameleon skin' that changes colour when exposed to light and could be used in applications such as active camouflage and large-scale dynamic displays.

The material, developed by researchers from the University of Cambridge, is made of tiny particles of gold coated in a polymer shell, and then squeezed into microdroplets of water in oil. When exposed to heat or light, the particles stick together, changing the colour of the material. The results are reported in the journal Advanced Optical Materials.

In nature, animals such as chameleons and cuttlefish are able to change colour thanks to chromatophores: skin cells with contractile fibres that move pigments around. The pigments are spread out to show their colour, or squeezed together to make the cell clear.

The artificial chromatophores developed by the Cambridge researchers are built on the same principle, but instead of contractile fibres, their colour-changing abilities rely on light-powered nano-mechanisms, and the 'cells' are microscopic drops of water.

When the material is heated above 32°C, the nanoparticles store large amounts of elastic energy in a fraction of a second, as the polymer coatings expel all the water and collapse. This has the effect of forcing the nanoparticles to bind together into tight clusters. When the material is cooled, the polymers take on water and expand, and the gold nanoparticles are strongly and quickly pushed apart, like a spring.

"Loading the nanoparticles into the microdroplets allows us to control the shape and size of the clusters, giving us dramatic colour changes," said Dr Andrew Salmon from Cambridge's Cavendish Laboratory, the study's co-first author.

The geometry of the nanoparticles when they bind into clusters determines which colour they appear as: when the nanoparticles are spread apart they are red and when they cluster together they are dark blue. However, the droplets of water also compress the particle clusters, causing them to shadow each other and make the clustered state nearly transparent.

At the moment, the material developed by the Cambridge researchers is in a single layer, so is only able to change to a single colour. However, different nanoparticle materials and shapes could be used in extra layers to make a fully dynamic material, like real chameleon skin.

The researchers also observed that the artificial cells can 'swim' in simple ways, similar to the alga Volvox. Shining a light on one edge of the droplets causes the surface to peel towards the light, pushing it forward. Under stronger illumination, high pressure bubbles briefly form to push the droplets along a surface.

"This work is a big advance in using nanoscale technology to do biomimicry," said co-author Sean Cormier. "We're now working to replicate this on roll-to-roll films so that we can make metres of colour changing sheets. Using structured light we also plan to use the light-triggered swimming to 'herd' droplets. It will be really exciting to see what collective behaviours are generated."

Credit: 
University of Cambridge

California's rooftop-solar boom leaves equity gap

OAKLAND, CA - California leads the nation in the adoption of rooftop solar systems, but information on which communities do, and do not, benefit from these installations has been limited to broad income classifications and anecdotal observations. Now, the data is in: The adoption of distributed solar - rooftop installations as opposed to industrial-scale operations like solar farms - is closely correlated with socioeconomic status as well as with health, environmental and demographic indicators. The study, published online August 20 in Energy Policy, is the first peer-reviewed analysis of distributed solar adoption in disadvantaged communities.

Researchers found that solar adoption rates in California's 5 percent most disadvantaged communities, as identified using CalEnviroScreen, the state's environmental justice screening tool, are less than one-eighth of those in the state's 5 percent least disadvantaged communities. The researchers analyzed solar adoption at the census-tract level and also looked closely at which characteristics of these communities were correlated with solar adoption.

"Income is a big factor in predicting who adopts solar, but it's not the most important factor," said lead author Boris Lukanov, PhD, a researcher at Physicians, Scientists and Engineers for Healthy Energy (PSE), the Oakland-based energy science and policy institute. "There is a combination of factors that are more powerful together than any one individually." In addition to poverty, housing burden, linguistic isolation and low educational attainment rates are three key factors that, taken together, are correlated with low rooftop solar energy adoption rates in communities across California. "That suggests that it's not only about financial resources or socioeconomic status. It's about information and access to information," Dr. Lukanov said.
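The multi-factor pattern Lukanov describes is the kind of relationship a census-tract regression can expose: adoption depends on several predictors at once, so no single variable tells the whole story. The sketch below uses synthetic data and hypothetical variable names, not the study's actual model or coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
n_tracts = 500

# Synthetic standardised tract-level predictors.
income = rng.normal(size=n_tracts)
education = rng.normal(size=n_tracts)
linguistic_isolation = rng.normal(size=n_tracts)

# Synthetic adoption rate driven by a combination of factors plus noise.
adoption = (0.4 * income + 0.3 * education
            - 0.3 * linguistic_isolation
            + rng.normal(scale=0.5, size=n_tracts))

# Ordinary least squares: intercept plus one coefficient per predictor.
X = np.column_stack([np.ones(n_tracts), income, education, linguistic_isolation])
coef, *_ = np.linalg.lstsq(X, adoption, rcond=None)
print(coef)  # approximately [0, 0.4, 0.3, -0.3]
```

In a fit like this, income carries the largest single coefficient, yet the other predictors jointly explain much of the variation, mirroring the study's finding that no one factor dominates.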

The findings have implications for how the state incentivizes solar in the future. Historically, California utilities have directed clean-energy subsidies towards "low-income" households - those earning 80 percent or less of a region's median income, according to the California Department of Housing and Community Development. The study indicates that these programs have been insufficient to reduce inequities in access to rooftop solar, despite recently improving trends.

By comparison, CalEnviroScreen, a tool developed in 2013 by the California Office of Environmental Health Hazard Assessment, measures multiple factors that can contribute to a community's vulnerability to pollution and its effects compared to the general population, including income level, linguistic isolation, rates of education and employment, health indicators such as asthma rates, and environmental burden indicators such as proximity to hazardous facilities.

California uses CalEnviroScreen to channel State cap-and-trade funds towards clean energy investments in disadvantaged communities. But limited data has hindered an analysis of the success of past programs. Now, newer programs such as the Disadvantaged Communities-Single-Family Affordable Solar Homes program, which incentivizes homeowners in disadvantaged communities to go solar, can be compared to the relevant existing adoption rates.

"Our study provides the first apples-to-apples baseline to measure how well new programs perform," said Elena Krieger, PhD, director of PSE's clean energy program and study co-author. "Accurately tracking the efficacy of existing and new solar programs will help us develop more effective policies and strategically design incentives to target disadvantaged communities."

The correlations in the study are associations, not causes, the researchers stress, but they say the findings identify specific issues to further explore so that policies can be aimed squarely at overcoming barriers to solar adoption.

Health, resiliency, and grid co-benefits

Understanding and overcoming the barriers to adopting solar photovoltaic energy generation is key to ensuring that the state's most disadvantaged communities also get the benefits of rooftop solar that more privileged communities have enjoyed for years, Dr. Krieger said. In addition to lower bills, consistent bills can help households avoid electricity cut-offs by avoiding costly seasonal spikes. In some communities where local loads are served by nearby gas power plants, replacing these polluting facilities with local storage and distributed solar can positively impact local air quality, reducing a community's environmental burden.

Maximizing the potential of clean energy technologies is also an important step in reaching California's ambitious emissions-reduction goals.

"And it may provide benefits to the grid," Dr. Krieger said. "You'll increase solar, along with storage, in places that may not have had it and you'll have more of an equal distribution of energy sources." That can increase a region's resilience during electric shut-down events from fires and extreme heat, Krieger says, which are predicted to increase as the climate warms. "If we can figure out how to access these communities for solar, ideally we can leverage those relationships and policies to set a precedent for adopting storage and other emerging technologies, increasing equitable access going forward."

Other states are already emulating California's focus on environmental justice. Michigan is piloting an environmental justice tool, and New York is starting to incorporate environmental justice considerations into how it sites power plants.

"California is setting a precedent here, so the implications are broad - for how to encourage equitable solar adoption and how to implement all clean energy technologies," Dr. Krieger said.

Credit: 
PSE Healthy Energy